[ 535.304247] env[59335]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 535.760870] env[59379]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 537.295456] env[59379]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59379) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 537.295781] env[59379]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59379) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 537.295815] env[59379]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59379) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 537.296129] env[59379]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 537.297200] env[59379]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 537.413333] env[59379]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59379) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 537.423499] env[59379]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59379) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 537.529930] env[59379]: INFO nova.virt.driver [None req-17e41200-cec2-42cc-af13-0e9e5bbaf082 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 537.641243] env[59379]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 537.641408] env[59379]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 537.641521] env[59379]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59379) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 540.970017] env[59379]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-6628d728-af1b-42c7-b57b-4e1c36065761 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 540.985403] env[59379]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59379) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 540.985571] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-f435d6e7-5750-41df-a4f9-1166762ddd4f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.009350] env[59379]: INFO oslo_vmware.api [-] Successfully established new session; session ID is d72fe.
[ 541.009511] env[59379]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.368s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 541.010222] env[59379]: INFO nova.virt.vmwareapi.driver [None req-17e41200-cec2-42cc-af13-0e9e5bbaf082 None None] VMware vCenter version: 7.0.3
[ 541.013685] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14bb042b-7e74-4f81-9934-6b17a6126a75 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.031178] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2063fea7-3d90-4bc6-8b56-14e83420cb4e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.037304] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b6220d5-c6ed-494e-aae6-9851281abb85 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.044134] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54c5e093-6e56-46eb-9079-23d2ea0bf9ce {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.057410] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0312184-c584-475d-8697-a8cbce3e27db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.063842] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a15c6349-440a-4fd2-b8f2-b1db6cab4edd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.094550] env[59379]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-e97bba4e-750f-4768-bdfa-cd9a542d381c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.099494] env[59379]: DEBUG nova.virt.vmwareapi.driver [None req-17e41200-cec2-42cc-af13-0e9e5bbaf082 None None] Extension org.openstack.compute already exists. {{(pid=59379) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 541.102199] env[59379]: INFO nova.compute.provider_config [None req-17e41200-cec2-42cc-af13-0e9e5bbaf082 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 541.119203] env[59379]: DEBUG nova.context [None req-17e41200-cec2-42cc-af13-0e9e5bbaf082 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),11743078-64bf-4468-a786-557c60808969(cell1) {{(pid=59379) load_cells /opt/stack/nova/nova/context.py:464}}
[ 541.121106] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 541.121318] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 541.122061] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 541.122422] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Acquiring lock "11743078-64bf-4468-a786-557c60808969" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 541.122605] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Lock "11743078-64bf-4468-a786-557c60808969" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 541.123565] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Lock "11743078-64bf-4468-a786-557c60808969" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 541.136407] env[59379]: DEBUG oslo_db.sqlalchemy.engines [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59379) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 541.136776] env[59379]: DEBUG oslo_db.sqlalchemy.engines [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59379) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 541.143487] env[59379]: ERROR nova.db.main.api [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 541.143487] env[59379]: result = function(*args, **kwargs)
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 541.143487] env[59379]: return func(*args, **kwargs)
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 541.143487] env[59379]: result = fn(*args, **kwargs)
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 541.143487] env[59379]: return f(*args, **kwargs)
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 541.143487] env[59379]: return db.service_get_minimum_version(context, binaries)
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 541.143487] env[59379]: _check_db_access()
[ 541.143487] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 541.143487] env[59379]: stacktrace = ''.join(traceback.format_stack())
[ 541.143487] env[59379]:
[ 541.144480] env[59379]: ERROR nova.db.main.api [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 541.144480] env[59379]: result = function(*args, **kwargs)
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 541.144480] env[59379]: return func(*args, **kwargs)
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 541.144480] env[59379]: result = fn(*args, **kwargs)
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 541.144480] env[59379]: return f(*args, **kwargs)
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 541.144480] env[59379]: return db.service_get_minimum_version(context, binaries)
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 541.144480] env[59379]: _check_db_access()
[ 541.144480] env[59379]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 541.144480] env[59379]: stacktrace = ''.join(traceback.format_stack())
[ 541.144480] env[59379]:
[ 541.144955] env[59379]: WARNING nova.objects.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 541.144955] env[59379]: WARNING nova.objects.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Failed to get minimum service version for cell 11743078-64bf-4468-a786-557c60808969
[ 541.145374] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Acquiring lock "singleton_lock" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 541.145525] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Acquired lock "singleton_lock" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 541.145757] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Releasing lock "singleton_lock" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 541.146130] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Full set of CONF: {{(pid=59379) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 541.146273] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ******************************************************************************** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 541.146395] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] Configuration options gathered from: {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 541.146524] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 541.146716] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 541.146839] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ================================================================================ {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 541.147054] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] allow_resize_to_same_host = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147221] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] arq_binding_timeout = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147347] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] backdoor_port = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147470] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] backdoor_socket = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147625] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] block_device_allocate_retries = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147783] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] block_device_allocate_retries_interval = 3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.147943] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cert = self.pem {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148116] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148280] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute_monitors = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148441] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] config_dir = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148603] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] config_drive_format = iso9660 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148735] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.148891] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] config_source = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149099] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] console_host = devstack {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149266] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] control_exchange = nova {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149418] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cpu_allocation_ratio = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149570] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] daemon = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149759] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] debug = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.149907] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] default_access_ip_network_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150085] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] default_availability_zone = nova {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150237] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] default_ephemeral_format = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150467] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150623] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] default_schedule_zone = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150825] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] disk_allocation_ratio = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.150991] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] enable_new_services = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151183] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] enabled_apis = ['osapi_compute'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151345] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] enabled_ssl_apis = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151500] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] flat_injected = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151654] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] force_config_drive = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151806] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] force_raw_images = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.151989] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] graceful_shutdown_timeout = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.152184] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] heal_instance_info_cache_interval = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.152394] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] host = cpu-1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.152560] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.152721] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.152878] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153107] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153271] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_build_timeout = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153428] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_delete_interval = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153590] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_format = [instance: %(uuid)s] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153753] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_name_template = instance-%08x {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.153904] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_usage_audit = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154079] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_usage_audit_period = month {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154240] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154398] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154558] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] internal_service_availability_zone = internal {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154709] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] key = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.154865] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] live_migration_retry_count = 30 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155045] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_config_append = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155383] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_dir = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155535] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155657] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_options = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155815] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_rotate_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.155981] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_rotate_interval_type = days {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156156] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] log_rotation_type = none {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156279] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156399] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156559] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156719] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.156844] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157028] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] long_rpc_timeout = 1800 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157182] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_concurrent_builds = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157335] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_concurrent_live_migrations = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157492] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_concurrent_snapshots = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157658] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_local_block_devices = 3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157811] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_logfile_count = 30 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.157967] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] max_logfile_size_mb = 200 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158154] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] maximum_instance_delete_attempts = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158323] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] metadata_listen = 0.0.0.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] metadata_listen_port = 8775 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158650] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] metadata_workers = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158808] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] migrate_max_retries = -1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.158971] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] mkisofs_cmd = genisoimage {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.159186] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.159312] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] my_ip = 10.180.1.21 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.159467] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] network_allocate_retries = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.159637] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.159829] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160016] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] osapi_compute_listen_port = 8774 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160186] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] osapi_compute_unique_server_name_scope = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160350] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] osapi_compute_workers = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160503] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] password_length = 12 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160657] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] periodic_enable = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.160848] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] periodic_fuzzy_delay = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161043] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] pointer_model = usbtablet {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161214] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] preallocate_images = none {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161368] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] publish_errors = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161492] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] pybasedir = /opt/stack/nova {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161642] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ram_allocation_ratio = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161795] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rate_limit_burst = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.161954] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rate_limit_except_level = CRITICAL {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162119] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rate_limit_interval = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162273] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reboot_timeout = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162424] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reclaim_instance_interval = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162573] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] record = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162732] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reimage_timeout_per_gb = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.162922] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] report_interval = 120 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163104] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rescue_timeout = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163262] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reserved_host_cpus = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163415] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reserved_host_disk_mb = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163566] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reserved_host_memory_mb = 512 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163728] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] reserved_huge_pages = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.163909] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] resize_confirm_window = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164086] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] resize_fs_using_block_device = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164243] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] resume_guests_state_on_host_boot = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164406] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164562] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] rpc_response_timeout = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164716] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] run_external_periodic_tasks = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.164876] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] running_deleted_instance_action = reap {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165042] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165197] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] running_deleted_instance_timeout = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165350] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] scheduler_instance_sync_interval = 120 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165479] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] service_down_time = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165639] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] servicegroup_driver = db {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165794] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] shelved_offload_time = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.165951] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] shelved_poll_interval = 3600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.166137] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] shutdown_timeout = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.166293] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] source_is_ipv6 = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.166445] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ssl_only = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.166680] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.166859] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] sync_power_state_interval = 600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167042] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] sync_power_state_pool_size = 1000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167213] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] syslog_log_facility = LOG_USER {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167363] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] tempdir = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167517] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] timeout_nbd = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167698] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] transport_url = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.167866] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] update_resources_interval = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168221] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_cow_images = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168221] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_eventlog = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168348] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_journal = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168481] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_json = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168633] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_rootwrap_daemon = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168786] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_stderr = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.168938] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] use_syslog = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169101] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vcpu_pin_set = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169262] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plugging_is_fatal = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169423] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plugging_timeout = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169580] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] virt_mkfs = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169758] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] volume_usage_poll_interval = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.169939] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] watch_log_file = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.170133] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] web = /usr/share/spice-html5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 541.170317] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_concurrency.disable_process_locking = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.170599] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.170773] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.170936] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171114] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171277] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171436] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171611] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.auth_strategy = keystone {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171770] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.compute_link_prefix = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.171940] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172119] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.dhcp_domain = novalocal {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172282] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.enable_instance_password = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172438] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.glance_link_prefix = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172593] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172755] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.172932] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.instance_list_per_project_cells = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173115] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.list_records_by_skipping_down_cells = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173275] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.local_metadata_per_cell = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173445] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.max_limit = 1000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173609] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.metadata_cache_expiration = 15 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173776] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.neutron_default_tenant_id = default {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.173939] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.use_forwarded_for = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174111] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.use_neutron_default_nets = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174275] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174430] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174590] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174755] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.174921] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_dynamic_targets = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175096] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_jsonfile_path = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175276] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175461] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.backend = dogpile.cache.memcached {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175625] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.backend_argument = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175788] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.config_prefix = cache.oslo {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.175970] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.dead_timeout = 60.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176156] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.debug_cache_backend = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176315] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.enable_retry_client = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176470] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.enable_socket_keepalive = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176633] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.enabled = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176794] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.expiration_time = 600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.176952] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.hashclient_retry_attempts = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177125] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177284] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_dead_retry = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177443] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_password = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177598] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177780] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.177944] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_pool_maxsize = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178114] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178275] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_sasl_enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178448] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178607] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178767] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.memcache_username = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.178936] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.proxies = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179136] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.retry_attempts = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179301] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.retry_delay = 0.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179463] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.socket_keepalive_count = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179615] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.socket_keepalive_idle = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179802] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.socket_keepalive_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.179973] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.tls_allowed_ciphers = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180140] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.tls_cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180292] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.tls_certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180447] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.tls_enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180598] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cache.tls_keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180777] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.auth_section = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.180960] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.auth_type = password {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181128] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181296] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181447] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181601] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181755] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.cross_az_attach = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.181955] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.debug = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.182195] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.endpoint_template = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.182372] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727
None None] cinder.http_retries = 3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.182533] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.insecure = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.182688] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.182853] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.os_region_name = RegionOne {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183021] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183183] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cinder.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183349] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183502] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.cpu_dedicated_set = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183654] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.cpu_shared_set = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183812] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.image_type_exclude_list = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.183969] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.184141] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.184307] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.184467] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.184627] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
541.184785] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.resource_provider_association_refresh = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.184961] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.shutdown_retry_interval = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185162] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185336] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] conductor.workers = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185505] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] console.allowed_origins = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185657] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] console.ssl_ciphers = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185820] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] console.ssl_minimum_version = default {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.185988] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] consoleauth.token_ttl = 600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186168] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186319] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186475] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186628] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.connect_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186781] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.connect_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.186931] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.endpoint_override = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187098] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.insecure = False {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187257] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187410] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.max_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187558] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.min_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187729] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.region_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.187892] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.service_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188093] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.service_type = accelerator {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188257] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188411] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.status_code_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188561] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.status_code_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188712] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.188887] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189056] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] cyborg.version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189234] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.backend = sqlalchemy {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189410] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.connection = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189574] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.connection_debug = 0 {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189758] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.connection_parameters = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.189927] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.connection_recycle_time = 3600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190108] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.connection_trace = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190273] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.db_inc_retry_interval = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190431] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.db_max_retries = 20 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190603] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.db_max_retry_interval = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190754] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.db_retry_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.190941] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.max_overflow = 50 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191115] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.max_pool_size = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191281] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.max_retries = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191438] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.mysql_enable_ndb = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191600] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191867] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.mysql_wsrep_sync_wait = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.191904] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.pool_timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.192068] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.retry_interval = 10 
{{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.192231] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.slave_connection = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.192384] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.sqlite_synchronous = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.192543] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] database.use_db_reconnect = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.192714] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.backend = sqlalchemy {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.193166] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.connection = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.193355] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.connection_debug = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.193529] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.connection_parameters = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.193693] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.connection_recycle_time = 3600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.193862] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.connection_trace = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194036] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.db_inc_retry_interval = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194205] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.db_max_retries = 20 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194367] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.db_max_retry_interval = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194527] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.db_retry_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194696] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.max_overflow = 50 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.194857] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.max_pool_size = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195033] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.max_retries = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195198] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.mysql_enable_ndb = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195363] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195519] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195677] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.pool_timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.195843] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.retry_interval = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196009] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.slave_connection = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196178] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] api_database.sqlite_synchronous = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196349] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] devices.enabled_mdev_types = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196521] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196681] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ephemeral_storage_encryption.enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.196843] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.197061] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.api_servers = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.cafile = None {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.connect_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.connect_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.debug = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198245] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.default_trusted_certificate_ids = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.enable_certificate_validation = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.enable_rbd_download = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.endpoint_override = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.insecure = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.198917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199065] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.max_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199195] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.min_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199297] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.num_retries = 3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199458] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.rbd_ceph_conf = {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199611] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.rbd_connect_timeout = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199803] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.rbd_pool = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.199978] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.rbd_user = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.region_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.service_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.service_type = image {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.status_code_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.status_code_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202064] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.verify_glance_signatures = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] glance.version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] guestfs.debug = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.config_drive_cdrom = False {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.config_drive_inject_password = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202804] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202804] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.enable_remotefx = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.202804] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.instances_path_share = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.iscsi_initiator_list = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.limit_cpu_features = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203148] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203307] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203473] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203630] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203794] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.203953] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.use_multipath_io = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.204126] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.204282] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.204435] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.vswitch_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.204629] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.204757] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] mks.enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205148] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205338] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.manager_interval = 2400 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205502] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.precache_concurrency = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205666] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.remove_unused_base_images = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205831] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.205997] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206185] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] image_cache.subdirectory_name = _base {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206357] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.api_max_retries = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206519] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.api_retry_interval = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206678] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.auth_section = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206835] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.auth_type = None {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.206993] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207173] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207335] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207492] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.connect_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207648] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.connect_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207804] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.endpoint_override = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.207962] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.insecure = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208128] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208282] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.max_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208435] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.min_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208588] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.partition_key = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208746] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.peer_list = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.208901] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.region_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209071] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.serial_console_state_timeout = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209229] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.service_name = None {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209397] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.service_type = baremetal {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209554] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209708] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.status_code_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.209891] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.status_code_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210065] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210247] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210406] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ironic.version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210582] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210764] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] key_manager.fixed_key = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.210958] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211135] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.barbican_api_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211291] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.barbican_endpoint = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211458] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.barbican_endpoint_type = public {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211612] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.barbican_region_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211766] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.211923] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212113] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212296] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.insecure = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212447] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212608] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.number_of_retries = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212769] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.retry_delay = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.212934] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.send_service_user_token = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213099] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213256] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213411] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.verify_ssl = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213567] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican.verify_ssl_path = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213726] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican_service_user.auth_section = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.213922] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican_service_user.auth_type = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.214047] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican_service_user.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.214205] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] barbican_service_user.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}

[ 541.214362 - 541.264617] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] option values, one DEBUG record per option, via log_opt_values {{(pid=59379) /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}; the identical per-record wrapper is elided below:

barbican_service_user.collect_timing = False
barbican_service_user.insecure = False
barbican_service_user.keyfile = None
barbican_service_user.split_loggers = False
barbican_service_user.timeout = None

vault.approle_role_id = None
vault.approle_secret_id = None
vault.cafile = None
vault.certfile = None
vault.collect_timing = False
vault.insecure = False
vault.keyfile = None
vault.kv_mountpoint = secret
vault.kv_version = 2
vault.namespace = None
vault.root_token_id = None
vault.split_loggers = False
vault.ssl_ca_crt_file = None
vault.timeout = None
vault.use_ssl = False
vault.vault_url = http://127.0.0.1:8200

keystone.cafile = None
keystone.certfile = None
keystone.collect_timing = False
keystone.connect_retries = None
keystone.connect_retry_delay = None
keystone.endpoint_override = None
keystone.insecure = False
keystone.keyfile = None
keystone.max_version = None
keystone.min_version = None
keystone.region_name = None
keystone.service_name = None
keystone.service_type = identity
keystone.split_loggers = False
keystone.status_code_retries = None
keystone.status_code_retry_delay = None
keystone.timeout = None
keystone.valid_interfaces = ['internal', 'public']
keystone.version = None
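Every record in this dump comes from the same call, ConfigOpts.log_opt_values(), walking each registered option group. A minimal sketch of how a group such as [vault] gets registered and dumped; the option declarations here are illustrative re-statements of values seen above, the authoritative definitions live in the key-manager library, not here:

    import logging

    from oslo_config import cfg

    LOG = logging.getLogger(__name__)

    # Illustrative re-declaration of a few [vault] options from the dump.
    vault_group = cfg.OptGroup(name='vault', title='Vault key manager options')
    vault_opts = [
        cfg.StrOpt('vault_url', default='http://127.0.0.1:8200',
                   help='Base URL of the Vault server.'),
        cfg.StrOpt('kv_mountpoint', default='secret',
                   help='Mountpoint of the KV store.'),
        cfg.IntOpt('kv_version', default=2, help='KV store version.'),
        cfg.BoolOpt('use_ssl', default=False, help='Use TLS to reach Vault.'),
    ]

    CONF = cfg.ConfigOpts()
    CONF.register_group(vault_group)
    CONF.register_opts(vault_opts, group=vault_group)
    CONF([])  # parse an empty command line so defaults are resolved

    # This is the call that produced the "<group>.<option> = <value>" records
    # above, one DEBUG log line per option.
    CONF.log_opt_values(LOG, logging.DEBUG)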
libvirt.connection_uri =
libvirt.cpu_mode = None
libvirt.cpu_model_extra_flags = []
libvirt.cpu_models = []
libvirt.cpu_power_governor_high = performance
libvirt.cpu_power_governor_low = powersave
libvirt.cpu_power_management = False
libvirt.cpu_power_management_strategy = cpu_state
libvirt.device_detach_attempts = 8
libvirt.device_detach_timeout = 20
libvirt.disk_cachemodes = []
libvirt.disk_prefix = None
libvirt.enabled_perf_events = []
libvirt.file_backed_memory = 0
libvirt.gid_maps = []
libvirt.hw_disk_discard = None
libvirt.hw_machine_type = None
libvirt.images_rbd_ceph_conf =
libvirt.images_rbd_glance_copy_poll_interval = 15
libvirt.images_rbd_glance_copy_timeout = 600
libvirt.images_rbd_glance_store_name =
libvirt.images_rbd_pool = rbd
libvirt.images_type = default
libvirt.images_volume_group = None
libvirt.inject_key = False
libvirt.inject_partition = -2
libvirt.inject_password = False
libvirt.iscsi_iface = None
libvirt.iser_use_multipath = False
libvirt.live_migration_bandwidth = 0
libvirt.live_migration_completion_timeout = 800
libvirt.live_migration_downtime = 500
libvirt.live_migration_downtime_delay = 75
libvirt.live_migration_downtime_steps = 10
libvirt.live_migration_inbound_addr = None
libvirt.live_migration_permit_auto_converge = False
libvirt.live_migration_permit_post_copy = False
libvirt.live_migration_scheme = None
libvirt.live_migration_timeout_action = abort
libvirt.live_migration_tunnelled = False
libvirt.live_migration_uri = None
libvirt.live_migration_with_native_tls = False
libvirt.max_queues = None
libvirt.mem_stats_period_seconds = 10
libvirt.nfs_mount_options = None
libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt
libvirt.num_aoe_discover_tries = 3
libvirt.num_iser_scan_tries = 5
libvirt.num_memory_encrypted_guests = None
libvirt.num_nvme_discover_tries = 5
libvirt.num_pcie_ports = 0
libvirt.num_volume_scan_tries = 5
libvirt.pmem_namespaces = []
libvirt.quobyte_client_cfg = None
libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt
libvirt.rbd_connect_timeout = 5
libvirt.rbd_destroy_volume_retries = 12
libvirt.rbd_destroy_volume_retry_interval = 5
libvirt.rbd_secret_uuid = None
libvirt.rbd_user = None
libvirt.realtime_scheduler_priority = 1
libvirt.remote_filesystem_transport = ssh
libvirt.rescue_image_id = None
libvirt.rescue_kernel_id = None
libvirt.rescue_ramdisk_id = None
libvirt.rng_dev_path = /dev/urandom
libvirt.rx_queue_size = None
libvirt.smbfs_mount_options =
libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt
libvirt.snapshot_compression = False
libvirt.snapshot_image_format = None
libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots
libvirt.sparse_logical_volumes = False
libvirt.swtpm_enabled = False
libvirt.swtpm_group = tss
libvirt.swtpm_user = tss
libvirt.sysinfo_serial = unique
libvirt.tx_queue_size = None
libvirt.uid_maps = []
libvirt.use_virtio_for_bridges = True
libvirt.virt_type = kvm
libvirt.volume_clear = zero
libvirt.volume_clear_size = 0
libvirt.volume_use_multipath = False
libvirt.vzstorage_cache_path = None
libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz
libvirt.vzstorage_mount_group = qemu
libvirt.vzstorage_mount_opts = []
libvirt.vzstorage_mount_perms = 0770
libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt
libvirt.vzstorage_mount_user = stack
libvirt.wait_soft_reboot_seconds = 120
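For context on the [libvirt] values above: libvirt.virt_type is what ends up as the domain type in the guest XML a libvirt-based driver generates, and libvirt.rng_dev_path feeds the virtio RNG backend. A toy rendering of just that mapping (this assumes nothing about Nova's actual XML builder, which is far more involved):

    # Toy illustration: how two of the logged [libvirt] values surface in a
    # libvirt guest definition skeleton.
    def render_domain_skeleton(virt_type: str = 'kvm',
                               rng_dev_path: str = '/dev/urandom') -> str:
        return (
            f"<domain type='{virt_type}'>\n"
            f"  <devices>\n"
            f"    <rng model='virtio'>\n"
            f"      <backend model='random'>{rng_dev_path}</backend>\n"
            f"    </rng>\n"
            f"  </devices>\n"
            f"</domain>"
        )

    print(render_domain_skeleton())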
neutron.auth_section = None
neutron.auth_type = password
neutron.cafile = None
neutron.certfile = None
neutron.collect_timing = False
neutron.connect_retries = None
neutron.connect_retry_delay = None
neutron.default_floating_pool = public
neutron.endpoint_override = None
neutron.extension_sync_interval = 600
neutron.http_retries = 3
neutron.insecure = False
neutron.keyfile = None
neutron.max_version = None
neutron.metadata_proxy_shared_secret = ****
neutron.min_version = None
neutron.ovs_bridge = br-int
neutron.physnets = []
neutron.region_name = RegionOne
neutron.service_metadata_proxy = True
neutron.service_name = None
neutron.service_type = network
neutron.split_loggers = False
neutron.status_code_retries = None
neutron.status_code_retry_delay = None
neutron.timeout = None
neutron.valid_interfaces = ['internal', 'public']
neutron.version = None

notifications.bdms_in_notifications = False
notifications.default_level = INFO
notifications.notification_format = unversioned
notifications.notify_on_state_change = None
notifications.versioned_notifications_topics = ['versioned_notifications']

pci.alias = []
pci.device_spec = []
pci.report_in_placement = False
placement.auth_section = None
placement.auth_type = password
placement.auth_url = http://10.180.1.21/identity
placement.cafile = None
placement.certfile = None
placement.collect_timing = False
placement.connect_retries = None
placement.connect_retry_delay = None
placement.default_domain_id = None
placement.default_domain_name = None
placement.domain_id = None
placement.domain_name = None
placement.endpoint_override = None
placement.insecure = False
placement.keyfile = None
placement.max_version = None
placement.min_version = None
placement.password = ****
placement.project_domain_id = None
placement.project_domain_name = Default
placement.project_id = None
placement.project_name = service
placement.region_name = RegionOne
placement.service_name = None
placement.service_type = placement
placement.split_loggers = False
placement.status_code_retries = None
placement.status_code_retry_delay = None
placement.system_scope = None
placement.timeout = None
placement.trust_id = None
placement.user_domain_id = None
placement.user_domain_name = Default
placement.user_id = None
placement.username = placement
placement.valid_interfaces = ['internal', 'public']
placement.version = None
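The placement.* values above are standard keystoneauth1 options. A minimal sketch of the client they describe, as standalone code; inside Nova this wiring is done by the keystoneauth loading helpers, and the password is masked as **** in the log, so a placeholder is used here:

    from keystoneauth1 import adapter, session
    from keystoneauth1.identity import v3

    # Values mirror the [placement] section logged above.
    auth = v3.Password(
        auth_url='http://10.180.1.21/identity',
        username='placement',
        password='REDACTED',  # logged as ****
        project_name='service',
        user_domain_name='Default',
        project_domain_name='Default',
    )
    sess = session.Session(auth=auth)
    placement = adapter.Adapter(
        session=sess,
        service_type='placement',
        region_name='RegionOne',
        interface=['internal', 'public'],
    )

    # e.g. list resource providers; placement expects a microversion header
    # (version negotiation is omitted in this sketch).
    resp = placement.get('/resource_providers',
                         headers={'OpenStack-API-Version': 'placement 1.10'})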
quota.cores = 20
quota.count_usage_from_placement = False
quota.driver = nova.quota.DbQuotaDriver
quota.injected_file_content_bytes = 10240
quota.injected_file_path_length = 255
quota.injected_files = 5
quota.instances = 10
quota.key_pairs = 100
quota.metadata_items = 128
quota.ram = 51200
quota.recheck_quota = True
quota.server_group_members = 10
quota.server_groups = 10
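quota.driver = nova.quota.DbQuotaDriver enforces these limits per project. Schematically, a request is rejected when usage plus the request exceeds any limit; a toy check over the logged defaults, not the driver's actual logic:

    # Toy quota check against the [quota] defaults logged above.
    QUOTA = {'instances': 10, 'cores': 20, 'ram': 51200}  # ram is in MiB

    def exceeds_quota(requested: dict, used: dict, quota: dict = QUOTA) -> list:
        """Return the resources a request would push over its limit."""
        return [r for r, limit in quota.items()
                if used.get(r, 0) + requested.get(r, 0) > limit]

    # A project using 8 instances / 18 cores asking for 1 more 4-VCPU server:
    print(exceeds_quota({'instances': 1, 'cores': 4},
                        {'instances': 8, 'cores': 18}))
    # -> ['cores']  (22 > 20, while 9 instances is still within 10)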
rdp.enabled = False
rdp.html5_proxy_base_url = http://127.0.0.1:6083/

scheduler.discover_hosts_in_cells_interval = -1
scheduler.enable_isolated_aggregate_filtering = False
scheduler.image_metadata_prefilter = False
scheduler.limit_tenants_to_placement_aggregate = False
scheduler.max_attempts = 3
scheduler.max_placement_results = 1000
scheduler.placement_aggregate_required_for_tenants = False
scheduler.query_placement_for_availability_zone = True
scheduler.query_placement_for_image_type_support = False
scheduler.query_placement_for_routed_network_aggregates = False
scheduler.workers = 2
filter_scheduler.aggregate_image_properties_isolation_namespace = None
filter_scheduler.aggregate_image_properties_isolation_separator = .
filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters']
filter_scheduler.build_failure_weight_multiplier = 1000000.0
filter_scheduler.cpu_weight_multiplier = 1.0
filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0
filter_scheduler.disk_weight_multiplier = 1.0
filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter']
filter_scheduler.host_subset_size = 1
filter_scheduler.image_properties_default_architecture = None
filter_scheduler.io_ops_weight_multiplier = -1.0
filter_scheduler.isolated_hosts = []
filter_scheduler.isolated_images = []
filter_scheduler.max_instances_per_host = 50
filter_scheduler.max_io_ops_per_host = 8
filter_scheduler.pci_in_placement = False
filter_scheduler.pci_weight_multiplier = 1.0
filter_scheduler.ram_weight_multiplier = 1.0
filter_scheduler.restrict_isolated_hosts_to_isolated_images = True
filter_scheduler.shuffle_best_same_weighed_hosts = False
filter_scheduler.soft_affinity_weight_multiplier = 1.0
filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0
filter_scheduler.track_instance_changes = True
filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers']

metrics.required = True
metrics.weight_multiplier = 1.0
metrics.weight_of_unavailable = -10000.0
metrics.weight_setting = []
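filter_scheduler.enabled_filters and weight_classes drive the classic filter-then-weigh pass: filters prune hosts, weighers score the survivors, each weigher scaled by its *_weight_multiplier, and host_subset_size picks randomly among the top candidates. In outline, as a schematic sketch in Python (the filter and weigher callables are hypothetical stand-ins, not Nova's implementation):

    import random

    def schedule(hosts, spec, filters, weighers, host_subset_size=1):
        # Filters prune; a host must pass every enabled filter.
        candidates = [h for h in hosts if all(f(h, spec) for f in filters)]
        if not candidates:
            raise LookupError('No valid host found')
        # Weighers score; each (weigher, multiplier) pair contributes.
        scored = sorted(
            candidates,
            key=lambda h: sum(mult * w(h, spec) for w, mult in weighers),
            reverse=True,
        )
        # host_subset_size = 1 (the logged value) makes this deterministic.
        return random.choice(scored[:host_subset_size])

    # Hypothetical filter/weigher stand-ins for illustration:
    compute_filter = lambda host, spec: host['enabled']
    ram_weigher = (lambda host, spec: host['free_ram_mb'], 1.0)  # ram_weight_multiplier

    best = schedule(
        hosts=[{'enabled': True, 'free_ram_mb': 4096},
               {'enabled': True, 'free_ram_mb': 8192}],
        spec={},
        filters=[compute_filter],
        weighers=[ram_weigher],
    )
    print(best)  # the 8192 MiB host wins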
serial_console.base_url = ws://127.0.0.1:6083/
serial_console.enabled = False
serial_console.port_range = 10000:20000
serial_console.proxyclient_address = 127.0.0.1
serial_console.serialproxy_host = 0.0.0.0
serial_console.serialproxy_port = 6083

service_user.auth_section = None
service_user.auth_type = password
service_user.cafile = None
service_user.certfile = None
service_user.collect_timing = False
service_user.insecure = False
service_user.keyfile = None
service_user.send_service_user_token = True
service_user.split_loggers = False
service_user.timeout = None

spice.agent_enabled = True
spice.enabled = False
spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html
spice.html5proxy_host = 0.0.0.0
spice.html5proxy_port = 6082
spice.image_compression = None
spice.jpeg_compression = None
spice.playback_compression = None
spice.server_listen = 127.0.0.1
spice.server_proxyclient_address = 127.0.0.1
spice.streaming_mode = None
spice.zlib_compression = None

upgrade_levels.baseapi = None
upgrade_levels.cert = None
upgrade_levels.compute = auto
upgrade_levels.conductor = None
upgrade_levels.scheduler = None

vendordata_dynamic_auth.auth_section = None
vendordata_dynamic_auth.auth_type = None
vendordata_dynamic_auth.cafile = None
vendordata_dynamic_auth.certfile = None
vendordata_dynamic_auth.collect_timing = False
vendordata_dynamic_auth.insecure = False
vendordata_dynamic_auth.keyfile = None
vendordata_dynamic_auth.split_loggers = False
vendordata_dynamic_auth.timeout = None

vmware.api_retry_count = 10
vmware.ca_file = None

[ 541.264617] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59379) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.264779] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.cluster_name = testcl1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.264942] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.connection_pool_size = 10 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265108] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.console_delay_seconds = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265274] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.datastore_regex = ^datastore.* {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265478] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265648] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.host_password = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265809] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.host_port = 443 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.265974] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.host_username = administrator@vsphere.local {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266155] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.insecure = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266312] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.integration_bridge = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266469] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.maximum_objects = 100 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266619] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.pbm_default_policy = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266776] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.pbm_enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.266931] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.pbm_wsdl_location = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267105] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267260] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.serial_port_proxy_uri = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267409] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.serial_port_service_uri = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267567] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.task_poll_interval = 0.5 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267759] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.use_linked_clone = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.267939] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.vnc_keymap = en-us {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.268112] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.vnc_port = 5900 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.268274] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vmware.vnc_port_total = 10000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.268463] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.auth_schemes = ['none'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.268634] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.268939] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.269141] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.269310] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.novncproxy_port = 6080 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.269527] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.server_listen = 127.0.0.1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.269751] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.269931] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 
None None] vnc.vencrypt_ca_certs = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270109] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.vencrypt_client_cert = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270268] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vnc.vencrypt_client_key = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270453] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270615] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270782] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.270936] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271107] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.disable_rootwrap = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271277] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.enable_numa_live_migration = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271434] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271591] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271748] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.271905] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.libvirt_disable_apic = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.272073] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.272236] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] 
workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274023] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.client_socket_timeout = 900 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.default_pool_size = 1000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274226] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.keep_alive = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274377] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.max_header_line = 16384 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274377] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.secure_proxy_ssl_header 
= None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274537] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.ssl_ca_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274669] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.ssl_cert_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274828] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.ssl_key_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.274984] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.tcp_keepidle = 600 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.275158] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.275851] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] zvm.ca_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.275851] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] zvm.cloud_connector_url = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.275851] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.276052] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] zvm.reachable_timeout = 300 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.276108] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.enforce_new_defaults = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.enforce_scope = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.policy_default_rule = default {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.policy_file = policy.yaml {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] 
env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277579] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277870] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277870] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277870] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277870] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.277870] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278077] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.connection_string = messaging:// {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278178] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.enabled = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278350] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.es_doc_type = notification {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278491] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.es_scroll_size = 10000 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278651] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.es_scroll_time = 2m {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278804] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.filter_error_trace = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.278960] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.279540] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.sentinel_service_name = mymaster {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.279540] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.socket_timeout = 0.1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.279540] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] profiler.trace_sqlalchemy = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280175] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] remote_debug.host = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280175] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] remote_debug.port = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280175] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280175] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280405] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280405] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280572] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280738] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.280925] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281103] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281267] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281421] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] 
oslo_messaging_rabbit.kombu_compression = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281588] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281751] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.281917] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282091] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282253] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282420] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282577] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282736] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.282898] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283068] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283230] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283395] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283553] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283712] env[59379]: DEBUG 
oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.283877] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284053] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284227] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284394] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284553] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284722] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.284892] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285094] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285260] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_notifications.retry = -1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285441] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285610] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285778] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.auth_section = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.285939] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.auth_type = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286102] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None 
None] oslo_limit.cafile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286252] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.certfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286407] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.collect_timing = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286559] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.connect_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286712] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.connect_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.286867] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.endpoint_id = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287031] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.endpoint_override = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287192] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.insecure = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287341] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.keyfile = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287491] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.max_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287657] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.min_version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287831] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.region_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.287988] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.service_name = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288184] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.service_type = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288350] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.split_loggers = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288516] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None 
None] oslo_limit.status_code_retries = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288671] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.status_code_retry_delay = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288826] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.timeout = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.288984] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.valid_interfaces = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289155] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_limit.version = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289319] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_reports.file_event_handler = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289480] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289634] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] oslo_reports.log_dir = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289823] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.289989] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.290161] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.290325] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.290486] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.290642] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.290843] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59379) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291026] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.group = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291188] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291352] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291512] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291663] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] vif_plug_ovs_privileged.user = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.291829] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292021] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292191] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292356] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292522] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292682] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.292843] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293006] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293187] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.isolate_vif = False {{(pid=59379) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293351] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293511] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293677] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293843] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.293998] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_vif_ovs.per_port_bridge = False {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294169] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] os_brick.lock_path = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294333] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.capabilities = [21] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294484] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.group = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294633] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.helper_command = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294792] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.294950] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.295114] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] privsep_osbrick.user = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.295282] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.295435] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.group = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 541.295585] env[59379]: DEBUG oslo_service.service [None 
req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.helper_command = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.295740] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.295899] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.296061] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] nova_sys_admin.user = None {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 541.296193] env[59379]: DEBUG oslo_service.service [None req-eb0045a4-6348-4e3a-8c8b-9068b628b727 None None] ******************************************************************************** {{(pid=59379) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}}
[ 541.296605] env[59379]: INFO nova.service [-] Starting compute node (version 0.1.0)
[ 541.307636] env[59379]: INFO nova.virt.node [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Generated node identity 693f1d2b-e627-44fb-bcd5-714cccac894b
[ 541.307884] env[59379]: INFO nova.virt.node [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Wrote node identity 693f1d2b-e627-44fb-bcd5-714cccac894b to /opt/stack/data/n-cpu-1/compute_id
[ 541.320677] env[59379]: WARNING nova.compute.manager [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Compute nodes ['693f1d2b-e627-44fb-bcd5-714cccac894b'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
[ 541.355505] env[59379]: INFO nova.compute.manager [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
[ 541.378524] env[59379]: WARNING nova.compute.manager [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found.
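The long run of log_opt_values records above, closed by the asterisk banner (cfg.py:2609 per option, cfg.py:2613 for the banner), is oslo.config's standard DEBUG dump of every registered option at service startup. A minimal sketch of the mechanism, assuming a single illustrative option mirroring vmware.vnc_port from the dump, not Nova's actual registration code:

    import logging

    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    # One illustrative option; Nova registers hundreds across many groups.
    CONF.register_opts([cfg.PortOpt('vnc_port', default=5900)], group='vmware')
    CONF([], project='nova')

    # Walks every registered option and logs "group.name = value" at the
    # requested level; options declared secret=True are masked as "****"
    # (cf. vmware.host_password and the transport_url in the dump above).
    CONF.log_opt_values(LOG, logging.DEBUG)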
[ 541.378747] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 541.378945] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 541.379095] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 541.379245] env[59379]: DEBUG nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 541.380493] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df1eac8-19ab-4439-b661-ec786b2b44ae {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.388929] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25be8b19-6886-4e34-920e-2346ff87fc2a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.403014] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47083522-9e18-4b76-ac2a-55ae189e30e6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.408990] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9f20f38-e886-4459-84ab-86e950ae914e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 541.437065] env[59379]: DEBUG nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181724MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 541.437200] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 541.437379] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 541.449223] env[59379]: WARNING nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] No compute node record for cpu-1:693f1d2b-e627-44fb-bcd5-714cccac894b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 693f1d2b-e627-44fb-bcd5-714cccac894b could not be found.
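The paired "Acquiring lock / acquired / released" records above (inner, lockutils.py:404/409/423) are the trace oslo.concurrency emits around any function wrapped with its synchronized decorator. A minimal sketch of the pattern; the function name and body are placeholders, not the resource tracker's actual code:

    import logging

    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('compute_resources')
    def audit_resources():
        # Anything else synchronized on "compute_resources" blocks until
        # this returns; lockutils logs how long each caller waited for
        # the lock and how long it was held, as in the records above.
        pass

    audit_resources()

The waited 0.000s / held 0.337s figures in this log are exactly those timings for the resource tracker's first audit pass.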
record for cpu-1:693f1d2b-e627-44fb-bcd5-714cccac894b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 693f1d2b-e627-44fb-bcd5-714cccac894b could not be found. [ 541.462125] env[59379]: INFO nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 693f1d2b-e627-44fb-bcd5-714cccac894b [ 541.509801] env[59379]: DEBUG nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 541.509983] env[59379]: DEBUG nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 541.612136] env[59379]: INFO nova.scheduler.client.report [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] [req-b5c35a7f-8883-4aaf-93a1-254b018ab0bc] Created resource provider record via placement API for resource provider with UUID 693f1d2b-e627-44fb-bcd5-714cccac894b and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 541.627350] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb2e8aa8-e46c-4248-8ec6-e162f57312fe {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.634636] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8c5bbf8-a33f-4d9e-8517-75de69dbceb0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.663316] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29adb7b4-6d3d-45c0-94db-be79b4ec04ee {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.669815] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a665216c-f6ab-425e-9f9b-024946505567 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.682345] env[59379]: DEBUG nova.compute.provider_tree [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 541.715712] env[59379]: DEBUG nova.scheduler.client.report [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Updated inventory for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 541.715925] env[59379]: DEBUG nova.compute.provider_tree [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Updating resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b generation from 0 to 1 during operation: update_inventory {{(pid=59379) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 541.716079] env[59379]: DEBUG nova.compute.provider_tree [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 541.755622] env[59379]: DEBUG nova.compute.provider_tree [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Updating resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b generation from 1 to 2 during operation: update_traits {{(pid=59379) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 541.773992] env[59379]: DEBUG nova.compute.resource_tracker [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 541.774185] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 541.774337] env[59379]: DEBUG nova.service [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Creating RPC server for service compute {{(pid=59379) start /opt/stack/nova/nova/service.py:182}} [ 541.787951] env[59379]: DEBUG nova.service [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] Join ServiceGroup membership for this service compute {{(pid=59379) start /opt/stack/nova/nova/service.py:199}} [ 541.788146] env[59379]: DEBUG nova.servicegroup.drivers.db [None req-e85da548-81fc-4bbc-a450-78f3e9ae6629 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59379) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 583.419335] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "624ec0e2-c230-4469-8ffe-047f914793b1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.419949] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 
tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "624ec0e2-c230-4469-8ffe-047f914793b1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.438048] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 583.546236] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.546524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.548168] env[59379]: INFO nova.compute.claims [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 583.681462] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9c207b-2f79-4cd9-936e-0695e3cddc7a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.689942] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-067d5485-e1c4-4a82-ad89-2f7f62a15a67 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.727276] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04e3d214-87a3-4981-ab3e-1b43b15c5126 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.736311] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eead6b71-ced2-4e96-82b7-c790f568f9c5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.753172] env[59379]: DEBUG nova.compute.provider_tree [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.769486] env[59379]: DEBUG nova.scheduler.client.report [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Inventory has not changed for provider 
693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 583.786986] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.786986] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 583.830460] env[59379]: DEBUG nova.compute.utils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 583.833305] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 583.833543] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 583.851670] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 583.950675] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Start spawning the instance on the hypervisor. 
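The inventory payload above is what fixes the schedulable capacity for this node: for each resource class, placement treats (total - reserved) * allocation_ratio as the amount that can be allocated. A minimal Python sketch of that arithmetic, using only the values from this log (effective_capacity is an illustrative helper, not a Nova or placement API):

# Values copied from the inventory records logged above.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}

def effective_capacity(record):
    # Placement allows allocations up to (total - reserved) * allocation_ratio.
    return int((record['total'] - record['reserved']) * record['allocation_ratio'])

for rc in inventory:
    print(rc, effective_capacity(inventory[rc]))
# VCPU 192, MEMORY_MB 196078, DISK_GB 400

This is why a 48-vCPU host can accept claims well past 48 vCPUs: the 4.0 allocation ratio advertises 192 schedulable VCPU to placement.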
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 584.105279] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 584.105530] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 584.106694] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 584.107044] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 584.107172] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 584.107564] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 584.107804] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 584.107959] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 584.108367] 
env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 584.108505] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 584.108683] env[59379]: DEBUG nova.virt.hardware [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 584.111775] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60944804-e8aa-4725-9d77-ff5466e43729 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.122708] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1777c60e-c65f-407b-abf3-9fe77d29e872 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.149096] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7bc666b-96cf-4668-a4ad-b98cdbe35dc0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.244899] env[59379]: DEBUG nova.policy [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf6c7cdc98f9419e938b071cf3d6217a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0c870ef0323546079b6471bb30e9eb36', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 584.267485] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.267710] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.301417] env[59379]: DEBUG 
nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 584.378724] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.378960] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.382460] env[59379]: INFO nova.compute.claims [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 584.510386] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.510659] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.529062] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Starting instance... 
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 584.539126] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c06fd181-6d1a-4696-b5ef-790d2b4dae42 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.546897] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd176fdb-dc48-46d7-9640-ffb7147f6107 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.594486] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e711e43-3fa7-43ac-bf6c-49847c720227 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.608024] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85a85593-6f5e-499e-931e-d536579e7340 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.613634] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.626781] env[59379]: DEBUG nova.compute.provider_tree [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.636440] env[59379]: DEBUG nova.scheduler.client.report [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 584.657085] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.657986] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Start building networks asynchronously for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 584.664019] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.050s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.665751] env[59379]: INFO nova.compute.claims [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 584.721452] env[59379]: DEBUG nova.compute.utils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 584.721452] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 584.724909] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 584.735674] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 584.836551] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Start spawning the instance on the hypervisor. 
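Every claim above is bracketed by the same Acquiring/acquired/released triplet on the "compute_resources" lock, with the waited/held timings showing how long each request queued behind the others (the b9ffb5d9 claim waited 0.050s while the ccaea0a9 claim held the lock for 0.278s). That pattern comes from oslo.concurrency's named locks; a minimal sketch, assuming a stand-in claim body rather than Nova's real resource tracker:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Stand-in for the tracker update. The synchronized decorator is what
    # emits the Acquiring/acquired/released DEBUG lines (with waited/held
    # timings) seen throughout this log, and it guarantees that claims,
    # audits and cache cleanups on this host never interleave.
    pass

Because the lock is keyed by name rather than tied to one function, instance_claim, _update_available_resource and clean_compute_node_cache all serialize against each other, which is exactly the interleaving visible in the timestamps above.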
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 584.864139] env[59379]: DEBUG nova.policy [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '286d530fe18948658ebd2710a36984d0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '940667196dca494b839e5099008c23db', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 584.868526] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 584.868745] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 584.868883] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 584.869061] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 584.869198] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 584.869488] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 584.869553] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 584.869682] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 584.869863] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 584.870038] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 584.870783] env[59379]: DEBUG nova.virt.hardware [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 584.871443] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1af84594-2c3f-443e-964e-c752015c6804 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.876311] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c864ea-331f-48a7-9408-24490e639fcd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.889120] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36cef7f8-b67c-4e84-94e6-52300a56e5be {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.907805] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b205f69-a1f6-4311-b86f-2d8db49cc70b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.941242] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d138064-818f-4b38-95f6-972cdb8ae6df {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.949919] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e9a5d3f-b8b1-463e-a2b1-34d45ab93819 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.965241] env[59379]: DEBUG nova.compute.provider_tree [None 
req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.975286] env[59379]: DEBUG nova.scheduler.client.report [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 585.000650] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.337s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.001447] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 585.060983] env[59379]: DEBUG nova.compute.utils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 585.062769] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 585.062769] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 585.080904] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Start building block device mappings for instance. 
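Note the ordering within each build: "Allocating IP information in the background" is logged first, block device mappings are then built, and the "Successfully created port" confirmation only arrives later. Nova runs the Neutron allocation on a separate greenthread and joins it before spawn; the sketch below shows the same shape with a plain thread pool, purely as an illustration (both helper functions are stand-ins, not Nova code):

from concurrent.futures import ThreadPoolExecutor

def allocate_network(instance_uuid):
    # Stand-in for the allocate_for_instance() call logged above.
    return {'ports': []}

def build_block_device_mappings(instance_uuid):
    # Stand-in for BDM construction, which proceeds without waiting.
    return []

def build_instance(instance_uuid):
    with ThreadPoolExecutor(max_workers=1) as pool:
        nw_future = pool.submit(allocate_network, instance_uuid)
        bdms = build_block_device_mappings(instance_uuid)
        network_info = nw_future.result()   # join before spawning
        return network_info, bdms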
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 585.171038] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 585.202114] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 585.202223] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 585.202657] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 585.202657] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 585.202657] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 585.202784] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 585.203110] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 585.203186] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 585.206025] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 585.206025] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 585.206025] env[59379]: DEBUG nova.virt.hardware [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 585.206025] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bc0187-8b6b-446b-bc33-8ee73ead086a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.217680] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a3875ac-f84b-4489-920f-281210402ed7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.365226] env[59379]: DEBUG nova.policy [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a555b6832df4a0bb32b26622abc2f1a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '51248048a2ed4ee1801cec899ba5301b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 585.416185] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Successfully created port: 1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 585.884409] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Successfully created port: b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} 
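Each of the three builds walks the same topology search: with flavor and image limits all 0:0:0 the caps default to 65536 sockets/cores/threads, the single vCPU of m1.nano factors only one way, and the result is VirtCPUTopology(cores=1,sockets=1,threads=1). A sketch of that enumeration, illustrative rather than Nova's actual nova/virt/hardware.py logic:

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Factor the vCPU count into sockets x cores x threads within the caps.
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))  # [(1, 1, 1)] -- "Got 1 possible topologies"

For vcpus=1 there is exactly one factorization, hence the single sorted candidate logged for every instance here.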
[ 586.717568] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Successfully created port: 3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 586.927524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "d74c7de4-5126-483f-9576-89e0007310b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.927890] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "d74c7de4-5126-483f-9576-89e0007310b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.951475] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 587.021172] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.021454] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.023012] env[59379]: INFO nova.compute.claims [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 587.166054] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef933b34-33d0-42e9-86bb-690c28b421fa {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.174391] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0afdca9b-c763-4a18-962a-dd096efb4844 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.210082] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-53e391ae-0ac6-4faa-b2cc-e2b52d04c501 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.218494] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdb21699-abc7-42b2-a1f6-a3a0f6cdefa9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.233125] env[59379]: DEBUG nova.compute.provider_tree [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 587.234547] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Successfully updated port: 1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 587.244797] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 587.245123] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquired lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 587.245123] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 587.249486] env[59379]: DEBUG nova.scheduler.client.report [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 587.265655] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.266324] env[59379]: DEBUG 
nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 587.311072] env[59379]: DEBUG nova.compute.utils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 587.314844] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 587.314844] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 587.328772] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 587.391855] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 587.421987] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.422677] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.425774] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Start spawning the instance on the hypervisor. 
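When the first port update lands, the manager takes a per-instance "refresh_cache-<uuid>" lock before rebuilding the network info cache, so concurrent refreshes of the same instance serialize while other instances proceed in parallel. A minimal sketch of that naming pattern, assuming a plain dict cache and a stand-in fetch function (lockutils.lock is the real oslo.concurrency API):

from oslo_concurrency import lockutils

_nw_info_cache = {}

def refresh_nw_info(instance_uuid, fetch_from_neutron):
    # One named lock per instance uuid, mirroring the
    # "refresh_cache-624ec0e2-..." acquire/release pair logged above.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        _nw_info_cache[instance_uuid] = fetch_from_neutron(instance_uuid)
        return _nw_info_cache[instance_uuid]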
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 587.434990] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 587.460734] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 587.460734] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 587.460734] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 587.461177] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 587.461177] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 587.461177] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 587.461177] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) 
_get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 587.461177] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 587.461318] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 587.461318] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 587.461318] env[59379]: DEBUG nova.virt.hardware [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 587.462766] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f30132d-16dd-4571-9c1a-4998d3fb60d6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.474311] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e80f2b0-d86a-4e2b-96b6-e298917ef74b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.500234] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 587.500482] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 587.501999] env[59379]: INFO nova.compute.claims [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 587.592482] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
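Note: the paired "Acquiring lock ... by ..." / "Lock ... acquired :: waited Ns" / "Lock ... "released" :: held Ns" entries throughout this log are emitted by the inner() wrapper of oslo.concurrency's synchronized decorator. A minimal sketch of the pattern, with a per-instance-UUID lock as in _locked_do_build_and_run_instance (the function bodies are placeholders, not nova's actual code):

```python
from oslo_concurrency import lockutils

def build_and_run_instance(instance_uuid):
    # One lock named after the instance UUID serializes concurrent build
    # requests for the same instance; lockutils logs acquire/release at
    # DEBUG with the waited/held durations seen in this log.
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        pass  # claim resources, allocate networks, spawn on the hypervisor

    _locked_do_build_and_run_instance()

build_and_run_instance('294a5f91-9db2-4a43-8230-d3e6906c30f0')
```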
[ 587.592712] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 587.608121] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 587.669507] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7b608ee-9d2d-4269-9584-17fa65140dc7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.673463] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 587.679147] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28013d20-b4f0-434b-8ccd-bd1e8ce5bbe9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.711593] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bd13098-0cf3-486d-ab42-c353b99d7443 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.719859] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d501932-a256-4583-98c2-a4d0b1f902c9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 587.735313] env[59379]: DEBUG nova.compute.provider_tree [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 587.746311] env[59379]: DEBUG nova.scheduler.client.report [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
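Note: as a worked reading of the inventory dict just logged (numbers copied from the entry above; the loop itself is only illustrative), placement treats (total - reserved) * allocation_ratio as the schedulable capacity of a resource class and max_unit as the per-instance ceiling:

```python
# Values copied from the 'Inventory has not changed for provider ...' entry.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'max_unit': 16, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'max_unit': 101, 'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: schedulable capacity {capacity:.0f}, per-instance cap {inv['max_unit']}")
# VCPU: schedulable capacity 192, per-instance cap 16
# MEMORY_MB: schedulable capacity 196078, per-instance cap 65530
# DISK_GB: schedulable capacity 400, per-instance cap 101
```

With a 4.0 VCPU allocation ratio the 48 physical cores are overcommitted to 192 schedulable vCPUs, while no single instance may claim more than 16.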
[ 587.764923] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 587.764923] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 587.770331] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.096s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 587.772081] env[59379]: INFO nova.compute.claims [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 587.780475] env[59379]: DEBUG nova.policy [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c6d6b7e3e1ac4eb39d66cc03a1688972', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9257ab9cebe8414fbc0c992b8c344182', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}}
[ 587.803514] env[59379]: DEBUG nova.compute.utils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 587.804774] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Allocating IP information in the background.
{{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 587.805127] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 587.815650] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 587.939958] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 587.970465] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 587.970465] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 587.970465] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 587.970631] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 587.971311] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 587.971311] env[59379]: DEBUG nova.virt.hardware [None 
req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 587.971311] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 587.971311] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 587.971541] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 587.971541] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 587.971641] env[59379]: DEBUG nova.virt.hardware [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 587.974188] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c33e925-2d19-4bf8-9bf1-10f6ac7d6115 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.989442] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54388ace-e345-4cf6-ade2-c92c74baa180 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.998592] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Successfully updated port: b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 588.012574] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 588.012734] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 
tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquired lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 588.012835] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 588.110809] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af220d9-7059-49f9-8a4a-5d40339d60c5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.120616] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92bc4aac-8c37-4b87-88e0-93cd1a67d416 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.125874] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 588.128876] env[59379]: DEBUG nova.policy [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9f944574bee047198ea5a8f997006b73', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de7e8d74ff79471c9b29bb62d6ca8f7b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}}
[ 588.163581] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2f3e2de-0d58-45a8-a5cc-7eab04d477ff {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.171502] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8071947a-3184-4279-a1ab-d251e26dffe6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.189051] env[59379]: DEBUG nova.compute.provider_tree [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 588.195524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
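Note: the recurring "Policy check for network:attach_external_network failed" entries (one appears just above) are nova.policy.authorize evaluating an oslo.policy rule against the request credentials and logging the False result at DEBUG; with only the member/reader roles the check fails and nova falls back to non-external networks. A rough sketch of that evaluation, where the admin-only rule string is an assumption for illustration, not nova's exact default policy:

```python
from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
enforcer.register_default(
    policy.RuleDefault('network:attach_external_network',
                       'is_admin:True'))  # assumed admin-only rule

# Credentials shaped like the dict in the log entry above.
creds = {'is_admin': False, 'roles': ['member', 'reader'],
         'project_id': 'de7e8d74ff79471c9b29bb62d6ca8f7b'}
print(enforcer.enforce('network:attach_external_network', {}, creds))  # False
```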
[ 588.195725] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 588.198048] env[59379]: DEBUG nova.scheduler.client.report [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 588.211302] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.441s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 588.212352] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 588.215438] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Starting instance...
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 588.254914] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Updating instance_info_cache with network_info: [{"id": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "address": "fa:16:3e:0c:bb:b8", "network": {"id": "83361975-cdd3-46a3-8792-09097b8b0152", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1585577391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c870ef0323546079b6471bb30e9eb36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff46260-4a", "ovs_interfaceid": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 588.260833] env[59379]: DEBUG nova.compute.utils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 588.262009] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Allocating IP information in the background. 
{{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 588.262228] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 588.270508] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.270508] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.271523] env[59379]: INFO nova.compute.claims [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 588.279310] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 588.292396] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Releasing lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 588.292396] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance network_info: |[{"id": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "address": "fa:16:3e:0c:bb:b8", "network": {"id": "83361975-cdd3-46a3-8792-09097b8b0152", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1585577391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c870ef0323546079b6471bb30e9eb36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff46260-4a", "ovs_interfaceid": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 588.292795] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:bb:b8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6046aec4-feda-4ef9-bf4a-800de1e0cd3b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ff46260-4a52-41ac-9057-7b7daabaf2bc', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 588.307286] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 588.307869] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31435887-cee5-4d25-9442-31fb2a6851af {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.328232] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Created folder: OpenStack in parent group-v4. 
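Note: the Folder.CreateFolder sequence above and just below builds the vmwareapi driver's three-level VM folder hierarchy: OpenStack under the datacenter's vmFolder (group-v4 here), then Project (<project_id>), then Instances. A sketch of the pattern (the helper name is mine, and error handling such as the DuplicateName races the real vm_util.create_folder copes with is omitted):

```python
def ensure_instance_folder(session, vm_folder_ref, project_id):
    """Sketch: build OpenStack/Project (<id>)/Instances under the DC vmFolder.

    `session` is assumed to be an oslo.vmware VMwareAPISession; invoke_api
    dispatches the vSphere SOAP method Folder.CreateFolder and returns the
    new folder's managed object reference, which seeds the next level.
    """
    parent = vm_folder_ref  # e.g. group-v4 in the log above
    for name in ('OpenStack', 'Project (%s)' % project_id, 'Instances'):
        parent = session.invoke_api(session.vim, 'CreateFolder',
                                    parent, name=name)
    return parent
```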
[ 588.328232] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating folder: Project (0c870ef0323546079b6471bb30e9eb36). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 588.329907] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-07baa765-455f-49a5-b353-274d04dbddf7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.343582] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Created folder: Project (0c870ef0323546079b6471bb30e9eb36) in parent group-v140509.
[ 588.343582] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating folder: Instances. Parent ref: group-v140510. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 588.343582] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-655b3aca-b246-4903-8458-0b0b4ba07451 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.358179] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "19253198-cb6e-4c48-a88b-26780f3606e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 588.358179] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 588.360471] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Created folder: Instances in parent group-v140510.
[ 588.360471] env[59379]: DEBUG oslo.service.loopingcall [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
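Note: the "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return." entry and the task lines that follow ("Waiting for the task: ...", "progress is 0%.") come from oslo.service's looping-call machinery: oslo.vmware's wait_for_task polls the vCenter task on a fixed interval until it succeeds or errors. A compressed sketch of that loop follows; the task-info reader is a stand-in for the real PropertyCollector round-trip, and in real code you would simply call session.wait_for_task(task_ref):

```python
from oslo_service import loopingcall

def wait_for_task_sketch(read_task_info, interval=0.5):
    """Poll a vCenter-style task dict until it finishes (illustrative only)."""
    def _poll():
        info = read_task_info()  # stand-in for one PropertyCollector read
        if info['state'] == 'success':
            # LoopingCallDone stops the timer and becomes wait()'s return value.
            raise loopingcall.LoopingCallDone(info.get('result'))
        if info['state'] == 'error':
            raise RuntimeError(info.get('error', 'task failed'))
        # queued/running: fall through, log progress, poll again

    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    return timer.start(interval=interval).wait()
```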
[ 588.361893] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 588.362756] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 588.365051] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-44fe222c-f27f-4a6d-8d5e-7e6bb0ac6098 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 588.383828] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 588.394348] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 588.394348] env[59379]: value = "task-559506"
[ 588.394348] env[59379]: _type = "Task"
[ 588.394348] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 588.408298] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559506, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 588.459673] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 588.504488] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 588.504731] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Flavor limits 0:0:0
{{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 588.506407] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 588.506407] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 588.506407] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 588.506407] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 588.506631] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 588.506667] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 588.506805] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 588.506958] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 588.507239] env[59379]: DEBUG nova.virt.hardware [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 588.508653] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb14e8f-be66-48c9-a85c-55235d23d1db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.518227] env[59379]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c82ff398-4a79-4fa1-9ea1-7fb743ca1da3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.551263] env[59379]: DEBUG nova.policy [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2e0c20ce66e045a5bfdffc27e037327e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd239a4f0ed5b48cf9cd9a334de6f189c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 588.554770] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c50c174-7342-45e7-9b08-ec3fb1142d1c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.562086] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd4e16a4-4dad-47c2-b4da-909e4205c12d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.595525] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-466308ed-f896-4386-bdec-9a9543d30a74 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.604587] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225ac3ce-2e3e-4239-85ed-9509bd52da6f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.619507] env[59379]: DEBUG nova.compute.provider_tree [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 588.628384] env[59379]: DEBUG nova.scheduler.client.report [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 588.650018] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 
588.650018] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 588.652567] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.193s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.658039] env[59379]: INFO nova.compute.claims [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 588.699611] env[59379]: DEBUG nova.compute.utils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 588.700928] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 588.705157] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 588.717379] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 588.807103] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 588.834262] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 588.834262] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 588.834262] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 588.834505] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 588.834505] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 588.834754] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 588.835196] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 588.835828] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 588.836359] env[59379]: DEBUG nova.virt.hardware [None 
req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 588.838923] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 588.838923] env[59379]: DEBUG nova.virt.hardware [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 588.838923] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38455eb9-de02-400a-a77c-075853e890a1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.853776] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4d66785-1f3c-492c-8958-fda7e633cc33 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.915665] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559506, 'name': CreateVM_Task, 'duration_secs': 0.322355} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 588.918647] env[59379]: DEBUG nova.policy [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ca0ef629a1964c7692635b0864879b8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6cc419217094416381972f1ec63d776f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 588.924194] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 588.938603] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54a544d0-f3f9-4b1a-8d4c-992a0303269b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.949579] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15383970-f07a-4de2-b69e-85520b8e1b86 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.954508] env[59379]: DEBUG oslo_vmware.service [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81bc502e-d819-43e9-b710-4c2c85cce1a2 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.962831] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 588.963017] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 588.963628] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 588.991320] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cacdf00a-94b3-478d-bf3c-d227db8b823d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.993531] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Updating instance_info_cache with network_info: [{"id": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "address": "fa:16:3e:31:4d:7b", "network": {"id": "11356636-2e2e-412c-aa2f-83b090e036a3", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1192875693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "940667196dca494b839e5099008c23db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c29724c-5452-441a-8060-5bf89d1f5847", "external-id": "nsx-vlan-transportzone-683", "segmentation_id": 683, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb84b176c-10", "ovs_interfaceid": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 588.994043] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d0fe118-428b-4a2c-8460-3b71baa61cc2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.011116] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 
tempest-ServersTestFqdnHostnames-1996090690-project-member] Waiting for the task: (returnval){ [ 589.011116] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52e1afc2-d0e6-5922-d209-6b4c7c498473" [ 589.011116] env[59379]: _type = "Task" [ 589.011116] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 589.012679] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d02d9f2-4480-4389-a36c-ea81adfc9f52 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.022912] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Releasing lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 589.024140] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance network_info: |[{"id": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "address": "fa:16:3e:31:4d:7b", "network": {"id": "11356636-2e2e-412c-aa2f-83b090e036a3", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1192875693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "940667196dca494b839e5099008c23db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c29724c-5452-441a-8060-5bf89d1f5847", "external-id": "nsx-vlan-transportzone-683", "segmentation_id": 683, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb84b176c-10", "ovs_interfaceid": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 589.024343] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:4d:7b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3c29724c-5452-441a-8060-5bf89d1f5847', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b84b176c-1034-4a6b-a3b8-2d3a86aa5f85', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 589.032202] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Creating folder: Project (940667196dca494b839e5099008c23db). 
Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 589.043014] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d643ab87-35d8-4aa5-8576-cccc854dedbc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.044199] env[59379]: DEBUG nova.compute.provider_tree [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 589.055990] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 589.056245] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 589.056462] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.056595] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 589.056976] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 589.057225] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0786c2a5-1fc4-4b6b-94ad-74a406cea949 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.062397] env[59379]: DEBUG nova.scheduler.client.report [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 589.074640] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Created folder: Project (940667196dca494b839e5099008c23db) in parent group-v140509. [ 589.074807] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Creating folder: Instances. Parent ref: group-v140513. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 589.076126] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Successfully created port: 6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 589.078704] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cdbb75a7-3bd0-42a3-971a-f962bf0936f2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.083452] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 589.083960] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 589.087250] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 589.087436] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 589.088967] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af099ece-0552-4c70-953a-51474876b4db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.101744] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Created folder: Instances in parent group-v140513. [ 589.101744] env[59379]: DEBUG oslo.service.loopingcall [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 589.101744] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-094508d5-2deb-4f2a-bd3f-7ba1694c8506 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.102260] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 589.102530] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-21f9419d-72b3-4578-b0cd-cae47e7de0e4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.121838] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Waiting for the task: (returnval){ [ 589.121838] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52274395-111b-1682-d04e-15177b4e5440" [ 589.121838] env[59379]: _type = "Task" [ 589.121838] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 589.127097] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 589.127097] env[59379]: value = "task-559509" [ 589.127097] env[59379]: _type = "Task" [ 589.127097] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 589.136713] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52274395-111b-1682-d04e-15177b4e5440, 'name': SearchDatastore_Task} progress is 0%. 
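
The "Waiting for the task: (returnval){ ... _type = "Task" }" blocks and the "progress is N%" records above come from oslo.vmware's task poller, which re-reads the task state on a timer until it reaches a terminal state (the real loop sits behind the wait_for_task/_poll_task frames in these records). A minimal sketch of the pattern, with get_task_info() as a hypothetical stand-in for the PropertyCollector round-trip performed on each tick:

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info() is a placeholder returning e.g.
        # {'state': 'running', 'progress': 6} or {'state': 'success'}.
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            # 'queued' / 'running': report progress and poll again
            print('progress is %s%%' % info.get('progress', 0))
            time.sleep(interval)
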
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 589.136713] env[59379]: DEBUG nova.compute.utils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 589.139658] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 589.139658] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 589.143995] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559509, 'name': CreateVM_Task} progress is 6%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 589.153865] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 589.273159] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Start spawning the instance on the hypervisor. 
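
The "Using /dev/sd instead of None" records are nova.compute.utils.get_next_device_name falling back to the /dev/sd prefix when the request names no device, then picking the next free letter. A simplified sketch of that letter-picking step (the real helper also normalizes other prefixes and validates against the instance's existing block-device list):

    import string

    def next_device_name(used, prefix='/dev/sd'):
        # First unused letter wins: /dev/sda, /dev/sdb, ... (simplified;
        # nova also copes with >26 disks and virtio-style /dev/vd prefixes)
        for letter in string.ascii_lowercase:
            candidate = prefix + letter
            if candidate not in used:
                return candidate
        raise ValueError('no free device name under %s' % prefix)

    print(next_device_name(set()))          # /dev/sda
    print(next_device_name({'/dev/sda'}))   # /dev/sdb
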
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 589.421279] env[59379]: DEBUG nova.policy [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e842af4991e744e48fc9432a7e6429ee', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b1e8436dbc5d4f26b38c83626def8b09', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 589.436581] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 589.436729] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 589.436846] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 589.437013] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 589.437159] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 589.437300] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 589.437501] env[59379]: DEBUG 
nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 589.437657] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 589.437851] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 589.438063] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 589.438237] env[59379]: DEBUG nova.virt.hardware [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 589.439105] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac498d5-879b-471e-b41c-5faed9ba9dfd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.449151] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9961ccaa-bb0d-44c4-be6b-cadaec4c530c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.640825] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 589.640825] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating directory with path [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 589.640825] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559509, 'name': CreateVM_Task, 'duration_secs': 0.309377} completed successfully. 
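
Each "Getting desirable topologies ... Build topologies for 1 vcpu(s) 1:1:1 ... Got 1 possible topologies" run reduces to simple arithmetic: with no flavor or image constraints the limits default to 65536 sockets/cores/threads, and the driver enumerates the factorizations of the vCPU count within those limits; for the 1-vCPU m1.nano flavor only 1:1:1 survives. A simplified sketch of that enumeration (the real nova.virt.hardware code additionally sorts the candidates against the preferred topology):

    from collections import namedtuple

    VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Every (sockets, cores, threads) triple whose product is vcpus
        # and which stays inside the limits is a candidate topology.
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        found.append(VirtCPUTopology(s, c, t))
        return found

    print(possible_topologies(1))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
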
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 589.640825] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1cf0d8d3-0381-4aa9-a886-ed5130b8680d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.642896] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 589.643207] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.643355] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 589.643689] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 589.644251] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-41bcb0a2-532d-4ff9-b0f7-7258e48c8a61 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.648561] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Waiting for the task: (returnval){ [ 589.648561] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]527874f6-cbd8-11bd-0484-be6a07f2d43e" [ 589.648561] env[59379]: _type = "Task" [ 589.648561] env[59379]: } to complete. 
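
The Acquiring/Acquired/Releasing triples around "[datastore2] devstack-image-cache_base/a816e082-..." are oslo.concurrency named locks: every request that needs the cached image serializes on the same lock name, so only one build downloads or processes a given image at a time. The same pattern in isolation (assuming oslo.concurrency is installed; the lock name mirrors the log but is otherwise illustrative):

    from oslo_concurrency import lockutils

    IMAGE_ID = 'a816e082-61f0-4ffa-a214-1bf6bd197f53'  # image id from this log

    def fetch_image_if_missing(datastore='datastore2'):
        # One named lock per cached image keeps concurrent builds from
        # racing on the same datastore path.
        lock_name = '[%s] devstack-image-cache_base/%s' % (datastore, IMAGE_ID)
        with lockutils.lock(lock_name):
            # look for the cached VMDK, download from Glance if absent ...
            pass
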
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 589.667926] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 589.667926] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 589.667926] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.668204] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Created directory with path [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 589.668204] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Fetch image to [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 589.668302] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 589.669114] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a048fc9f-2cac-44aa-992d-ff60dfbf91db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.681048] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6fe8c61-f0c2-4a48-9416-f7b47fc96e56 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.689559] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44275108-f30c-4eb8-a075-bace870da7e3 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.721110] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e3b7bf-a00f-4e8e-b76d-06329879dbac {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.727764] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd8435bc-0024-4c88-b6ec-9373ce29f1c5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.763433] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 589.853239] env[59379]: DEBUG oslo_vmware.rw_handles [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 589.910945] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Successfully created port: b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 589.915420] env[59379]: DEBUG oslo_vmware.rw_handles [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 589.915420] env[59379]: DEBUG oslo_vmware.rw_handles [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
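
"Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1..." is the image download being streamed straight onto the datastore through the ESX host's /folder HTTP endpoint; the write handle is closed once the image iterator is exhausted. A rough equivalent using requests (a hedged sketch: the real path goes through oslo_vmware.rw_handles with a session cookie or generic service ticket, and path/url/cookie below are placeholders):

    import requests

    def upload_vmdk(path, url, cookie):
        # Stream the VMDK to the datastore /folder URL without buffering
        # the whole file; auth is a vSphere session cookie or ticket.
        headers = {'Content-Type': 'application/octet-stream',
                   'Cookie': cookie}
        with open(path, 'rb') as f:
            resp = requests.put(url, data=f, headers=headers, verify=False)
        resp.raise_for_status()
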
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 590.215403] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Successfully updated port: 3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 590.229270] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 590.229394] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquired lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 590.229563] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 590.363880] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Successfully created port: 7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 590.423806] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 590.832169] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Successfully updated port: 6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 590.841252] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 590.841403] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquired lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 590.841541] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 590.899960] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 591.222950] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.223253] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.236872] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Starting instance... 
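
The network_info blobs written to the instance cache (like the entry for port b84b176c-... earlier, or 6ab20f44-... just below) are lists of VIF dicts; the MAC, fixed IPs, MTU and NSX logical-switch id the driver needs later are all nested inside. A small helper that flattens one such entry (field names are taken from the log records; the helper itself is illustrative):

    def summarize_vif(vif):
        # Reduce one network_info VIF dict to the fields the VMware
        # driver actually consumes when building the VM spec.
        ips = [ip['address']
               for subnet in vif['network']['subnets']
               for ip in subnet['ips']]
        return {'port_id': vif['id'],
                'mac': vif['address'],
                'fixed_ips': ips,
                'mtu': vif['network']['meta']['mtu'],
                'switch': vif['details'].get('nsx-logical-switch-id')}

    # e.g. -> {'port_id': 'b84b176c-...', 'mac': 'fa:16:3e:31:4d:7b',
    #          'fixed_ips': ['192.168.128.13'], 'mtu': 8950, 'switch': '3c29724c-...'}
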
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 591.274724] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Updating instance_info_cache with network_info: [{"id": "6ab20f44-59b4-4f00-8db3-c070261d875e", "address": "fa:16:3e:cb:35:b2", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ab20f44-59", "ovs_interfaceid": "6ab20f44-59b4-4f00-8db3-c070261d875e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.292645] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Releasing lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 591.292727] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance network_info: |[{"id": "6ab20f44-59b4-4f00-8db3-c070261d875e", "address": "fa:16:3e:cb:35:b2", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ab20f44-59", "ovs_interfaceid": "6ab20f44-59b4-4f00-8db3-c070261d875e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 591.293926] env[59379]: DEBUG oslo_concurrency.lockutils [None 
req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.294159] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.295987] env[59379]: INFO nova.compute.claims [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 591.298348] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cb:35:b2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6ab20f44-59b4-4f00-8db3-c070261d875e', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 591.306605] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Creating folder: Project (9257ab9cebe8414fbc0c992b8c344182). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.307326] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-193bc3e4-4723-4322-90f5-3626eeebbfbf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.318484] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Created folder: Project (9257ab9cebe8414fbc0c992b8c344182) in parent group-v140509. [ 591.318668] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Creating folder: Instances. Parent ref: group-v140516. 
{{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.318883] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf7d1b28-592e-4765-869f-d7dc6e6fb4cf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.329826] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Created folder: Instances in parent group-v140516. [ 591.329826] env[59379]: DEBUG oslo.service.loopingcall [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 591.329826] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 591.329826] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a3af9cc5-27fa-409b-aab4-233efd8204ef {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.353953] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 591.353953] env[59379]: value = "task-559512" [ 591.353953] env[59379]: _type = "Task" [ 591.353953] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 591.361835] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Successfully created port: 176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 591.369384] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559512, 'name': CreateVM_Task} progress is 5%. 
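
The paired "Creating folder: Project (...). Parent ref: group-v..." / "Creating folder: Instances" records are the driver building a per-tenant folder tree in vCenter, one Folder.CreateFolder call per level, tolerating folders another request created first. A sketch of that idempotent create, with invoke_create_folder and find_child as hypothetical stand-ins for the SOAP calls:

    class DuplicateName(Exception):
        """Stand-in for the vSphere fault raised when the folder exists."""

    def create_folder_if_missing(invoke_create_folder, find_child,
                                 parent_ref, name):
        # Try CreateFolder first; if the name already exists, look the
        # existing folder up and reuse it instead of failing the build.
        try:
            return invoke_create_folder(parent_ref, name)
        except DuplicateName:
            return find_child(parent_ref, name)

    # Usage mirrors the log: first the project folder under the compute
    # folder, then an Instances folder nested inside it.
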
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 591.514192] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f77274a-44eb-4560-9906-8ac9f92bbf2a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.526348] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd91ea9e-0ec5-44b5-b3ff-2e2373b4e2f2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.558382] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbee351f-dcbb-47b9-8d07-0adc737e82de {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.566138] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-594f5b39-e2b5-432d-aaef-52c75e337b37 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.582870] env[59379]: DEBUG nova.compute.provider_tree [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 591.589020] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Successfully updated port: b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 591.599143] env[59379]: DEBUG nova.scheduler.client.report [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 591.616418] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 591.616418] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 591.616418] env[59379]: DEBUG nova.network.neutron [None 
req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 591.620798] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.621365] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 591.658610] env[59379]: DEBUG nova.compute.utils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 591.660277] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 591.660572] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 591.670345] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 591.684928] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance cache missing network info. 
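
The inventory dict the report client keeps comparing ("Inventory has not changed for provider ... {'VCPU': {...}, 'MEMORY_MB': {...}, 'DISK_GB': {...}}") encodes per-resource-class capacity; what placement can actually schedule is (total - reserved) * allocation_ratio. A quick check against the values logged for provider 693f1d2b-e627-44fb-bcd5-714cccac894b:

    inventory = {  # values copied from the report-client records above
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        # placement's effective capacity per resource class
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, int(capacity))   # VCPU 192, MEMORY_MB 196078, DISK_GB 400
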
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 591.736039] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Successfully updated port: 7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 591.751351] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 591.753420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquired lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 591.753420] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 591.757757] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 591.793365] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 591.793365] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 591.793365] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 591.793555] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 591.793555] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 591.793633] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 591.793857] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 591.793926] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 
tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 591.794339] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 591.794554] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 591.795294] env[59379]: DEBUG nova.virt.hardware [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 591.796524] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d246564e-c726-42a9-9bdd-b0b635927ac4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.806941] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d05c9ca0-6644-426d-8116-c3d1fe412130 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.828108] env[59379]: DEBUG nova.policy [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '49028902d8e54906824c4a42be504e3d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a62f6b4b95a847bc914323ae8eca38fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 591.830831] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 591.865595] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559512, 'name': CreateVM_Task, 'duration_secs': 0.302495} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 591.865859] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 591.866428] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 591.866615] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 591.866928] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 591.867126] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f8fea01b-947c-452e-b3e9-06d7e9d2fb48 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.872277] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Waiting for the task: (returnval){ [ 591.872277] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52edb46c-cfbd-7443-d663-fe52b205201b" [ 591.872277] env[59379]: _type = "Task" [ 591.872277] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 591.881609] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52edb46c-cfbd-7443-d663-fe52b205201b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 591.883234] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Updating instance_info_cache with network_info: [{"id": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "address": "fa:16:3e:7b:d0:58", "network": {"id": "fd4c2ac4-525a-40e0-b703-86fa9f875b12", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-762383049-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "51248048a2ed4ee1801cec899ba5301b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "88651df2-0506-4f6c-b868-dd30a81f2b1c", "external-id": "nsx-vlan-transportzone-366", "segmentation_id": 366, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a8f691e-81", "ovs_interfaceid": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.897134] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Releasing lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 591.897419] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance network_info: |[{"id": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "address": "fa:16:3e:7b:d0:58", "network": {"id": "fd4c2ac4-525a-40e0-b703-86fa9f875b12", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-762383049-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "51248048a2ed4ee1801cec899ba5301b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "88651df2-0506-4f6c-b868-dd30a81f2b1c", "external-id": "nsx-vlan-transportzone-366", "segmentation_id": 366, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a8f691e-81", "ovs_interfaceid": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 591.897779] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:d0:58', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '88651df2-0506-4f6c-b868-dd30a81f2b1c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a8f691e-810b-46e6-9adb-0a48e8b6d8f2', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 591.908968] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Creating folder: Project (51248048a2ed4ee1801cec899ba5301b). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.909545] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-65019558-d53f-4d93-99ef-20bc16175287 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.920787] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Created folder: Project (51248048a2ed4ee1801cec899ba5301b) in parent group-v140509. [ 591.920986] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Creating folder: Instances. Parent ref: group-v140519. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.921220] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ebab82d-de30-4388-8b9b-65f329662772 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.929858] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Created folder: Instances in parent group-v140519. [ 591.930125] env[59379]: DEBUG oslo.service.loopingcall [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 591.930294] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 591.930432] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b857417-fb05-49f0-b5c0-1db5e5c2e203 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.949643] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 591.949643] env[59379]: value = "task-559515" [ 591.949643] env[59379]: _type = "Task" [ 591.949643] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 591.958126] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559515, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.171539] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Successfully created port: 61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 592.255740] env[59379]: DEBUG nova.compute.manager [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Received event network-vif-plugged-1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 592.255740] env[59379]: DEBUG oslo_concurrency.lockutils [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] Acquiring lock "624ec0e2-c230-4469-8ffe-047f914793b1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.255740] env[59379]: DEBUG oslo_concurrency.lockutils [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] Lock "624ec0e2-c230-4469-8ffe-047f914793b1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.256119] env[59379]: DEBUG oslo_concurrency.lockutils [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] Lock "624ec0e2-c230-4469-8ffe-047f914793b1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 592.256119] env[59379]: DEBUG nova.compute.manager [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] No waiting events found dispatching network-vif-plugged-1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 592.256616] env[59379]: WARNING nova.compute.manager [req-0de6133e-ed7f-4eba-9e8c-cbd1c9fb90a4 req-ee479494-f90b-4591-aa72-50e996c4ab0c service nova] [instance:
624ec0e2-c230-4469-8ffe-047f914793b1] Received unexpected event network-vif-plugged-1ff46260-4a52-41ac-9057-7b7daabaf2bc for instance with vm_state building and task_state spawning. [ 592.280498] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updating instance_info_cache with network_info: [{"id": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "address": "fa:16:3e:47:30:9f", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb87075cb-57", "ovs_interfaceid": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.297567] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 592.297749] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance network_info: |[{"id": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "address": "fa:16:3e:47:30:9f", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb87075cb-57", "ovs_interfaceid": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 592.298147] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:30:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b87075cb-577d-4937-9ab3-a88e6bfd27ea', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 592.309648] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating folder: Project (d239a4f0ed5b48cf9cd9a334de6f189c). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.309648] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5eca897d-44ea-42d6-b1bc-d0876b80bc6b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.322188] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created folder: Project (d239a4f0ed5b48cf9cd9a334de6f189c) in parent group-v140509. [ 592.322188] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating folder: Instances. Parent ref: group-v140522. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.322674] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d4d8f140-3fb5-4789-8c03-3e5fa8bceded {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.331115] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created folder: Instances in parent group-v140522. [ 592.331393] env[59379]: DEBUG oslo.service.loopingcall [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 592.331458] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 592.331694] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dda3c787-4460-491f-83a5-c8cf2d58d88b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.352960] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 592.352960] env[59379]: value = "task-559518" [ 592.352960] env[59379]: _type = "Task" [ 592.352960] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.356301] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updating instance_info_cache with network_info: [{"id": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "address": "fa:16:3e:3e:4b:15", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c97ef6c-1a", "ovs_interfaceid": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.361398] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559518, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.367768] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Releasing lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 592.368087] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance network_info: |[{"id": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "address": "fa:16:3e:3e:4b:15", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c97ef6c-1a", "ovs_interfaceid": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 592.368428] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3e:4b:15', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 592.375852] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Creating folder: Project (b1e8436dbc5d4f26b38c83626def8b09). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.376810] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-211bc211-67f1-42fc-a683-4d223f690f5d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.391234] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 592.391477] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 592.391677] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.391873] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Created folder: Project (b1e8436dbc5d4f26b38c83626def8b09) in parent group-v140509. [ 592.392034] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Creating folder: Instances. Parent ref: group-v140525. 
{{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.392252] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-60cd34fc-80a5-4100-a0d4-9d323d2f8fbc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.403146] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Created folder: Instances in parent group-v140525. [ 592.403386] env[59379]: DEBUG oslo.service.loopingcall [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 592.403548] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 592.403752] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ea3d993d-78a8-46c1-ad97-99630342592d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.427286] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 592.427286] env[59379]: value = "task-559521" [ 592.427286] env[59379]: _type = "Task" [ 592.427286] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.436354] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559521, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.460511] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559515, 'name': CreateVM_Task, 'duration_secs': 0.291429} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 592.460649] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.461513] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.461602] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 592.462733] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 592.462733] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-55579801-75da-44b3-8399-5313c15b15c5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.467425] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Waiting for the task: (returnval){ [ 592.467425] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52d8d0d8-23c9-d922-75d3-8cd3fb2ab7c4" [ 592.467425] env[59379]: _type = "Task" [ 592.467425] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.476994] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52d8d0d8-23c9-d922-75d3-8cd3fb2ab7c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.502662] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Successfully created port: 8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 592.863490] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559518, 'name': CreateVM_Task, 'duration_secs': 0.444033} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 592.863490] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.863490] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.938616] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559521, 'name': CreateVM_Task, 'duration_secs': 0.3574} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 592.938616] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.939204] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.983728] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 592.983973] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 592.984258] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.984422] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 592.985183] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" 
{{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 592.985418] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-465185a9-9c43-4f43-a142-f69bda8e2d8e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.991581] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){ [ 592.991581] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52282512-6b28-035f-6de5-ea7ebc904c7c" [ 592.991581] env[59379]: _type = "Task" [ 592.991581] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.999097] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52282512-6b28-035f-6de5-ea7ebc904c7c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.105201] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Successfully updated port: 8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 593.122603] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 593.122603] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquired lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 593.122603] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 593.168026] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 593.333282] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updating instance_info_cache with network_info: [{"id": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "address": "fa:16:3e:43:4a:7f", "network": {"id": "8425a611-200b-4de6-9821-ac5e0ff6dbe3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1567750693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a62f6b4b95a847bc914323ae8eca38fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9875d38f-76e2-416c-bfb7-f18a22b0d8ee", "external-id": "nsx-vlan-transportzone-442", "segmentation_id": 442, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77c178-47", "ovs_interfaceid": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 593.347363] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Releasing lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 593.347652] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance network_info: |[{"id": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "address": "fa:16:3e:43:4a:7f", "network": {"id": "8425a611-200b-4de6-9821-ac5e0ff6dbe3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1567750693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a62f6b4b95a847bc914323ae8eca38fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9875d38f-76e2-416c-bfb7-f18a22b0d8ee", "external-id": "nsx-vlan-transportzone-442", "segmentation_id": 442, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77c178-47", "ovs_interfaceid": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 593.348070] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:4a:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9875d38f-76e2-416c-bfb7-f18a22b0d8ee', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8d77c178-471c-445d-920a-ef44f8a1e2ed', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 593.355991] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Creating folder: Project (a62f6b4b95a847bc914323ae8eca38fc). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.356544] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-758a07c9-2d2a-49b2-bff7-c1d43b756bb8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.366927] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Created folder: Project (a62f6b4b95a847bc914323ae8eca38fc) in parent group-v140509. [ 593.367120] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Creating folder: Instances. Parent ref: group-v140528. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.367334] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7b2ba4a-8b8c-43a4-b0b8-bfbb612b3b6b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.377039] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Created folder: Instances in parent group-v140528. [ 593.377257] env[59379]: DEBUG oslo.service.loopingcall [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 593.377426] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 593.377607] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b36d5799-437f-4bfa-af44-7fc0d18aaddd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.397859] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 593.397859] env[59379]: value = "task-559524" [ 593.397859] env[59379]: _type = "Task" [ 593.397859] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 593.405665] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559524, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.502195] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 593.502485] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 593.502702] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 593.502894] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 593.503212] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 593.503440] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa97c21e-36e0-4cef-8ac8-2740f982df4b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.508919] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 
tempest-ServerExternalEventsTest-827682188-project-member] Waiting for the task: (returnval){ [ 593.508919] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52ad57fd-51c8-d916-09e5-5cea7415c853" [ 593.508919] env[59379]: _type = "Task" [ 593.508919] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 593.515747] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52ad57fd-51c8-d916-09e5-5cea7415c853, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.790564] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 593.821411] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Getting list of instances from cluster (obj){ [ 593.821411] env[59379]: value = "domain-c8" [ 593.821411] env[59379]: _type = "ClusterComputeResource" [ 593.821411] env[59379]: } {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 593.823301] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8042cbf5-dac6-46de-a520-38bd4321993b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.843634] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Got total of 7 instances {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 593.843793] env[59379]: WARNING nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] While synchronizing instance power states, found 9 instances in the database and 7 instances on the hypervisor. 
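
The warning just above ("found 9 instances in the database and 7 instances on the hypervisor") comes from the ComputeManager._sync_power_states periodic task logged at 593.790564: it lists instances from the cluster via the driver, compares that against the database view, and then serializes a per-instance sync behind a per-UUID lock (the "Acquiring lock <uuid> by ...query_driver_power_state_and_sync" entries that follow). A minimal sketch of that audit pattern using oslo.concurrency's synchronized decorator; the function name sync_power_states and its inputs are hypothetical stand-ins, not Nova's actual signatures:

```python
import logging

from oslo_concurrency import lockutils

LOG = logging.getLogger(__name__)


def sync_power_states(db_uuids, hypervisor_uuids, sync_one):
    """Illustrative audit loop.

    db_uuids, hypervisor_uuids and sync_one are hypothetical stand-ins
    for Nova's DB query, driver instance listing and per-instance sync
    routine; they are not Nova's real interfaces.
    """
    if len(db_uuids) != len(hypervisor_uuids):
        LOG.warning('While synchronizing instance power states, found %d '
                    'instances in the database and %d instances on the '
                    'hypervisor.', len(db_uuids), len(hypervisor_uuids))
    for uuid in db_uuids:
        # One lock per instance UUID: this is what produces the
        # 'Acquiring lock "<uuid>" by "...query_driver_power_state_and_sync"'
        # entries via lockutils' "inner" wrapper, and it keeps the audit
        # from racing a concurrent build or delete of the same instance.
        @lockutils.synchronized(uuid)
        def query_driver_power_state_and_sync():
            sync_one(uuid)

        query_driver_power_state_and_sync()
```

In this run the mismatch looks benign rather than an error: at the moment of the 593.84 cluster scan, 2545ca35 and 91dfb2c2 are still mid-build (91dfb2c2's CreateVM_Task only completes at 593.910092 below), so only seven of the nine database instances exist on the hypervisor yet.
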
[ 593.844018] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 624ec0e2-c230-4469-8ffe-047f914793b1 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844229] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid ccaea0a9-59d6-456a-9885-2b90abf30abb {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844379] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid b9ffb5d9-8d56-4980-9e78-1e003cd56f7e {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844520] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid d74c7de4-5126-483f-9576-89e0007310b8 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844659] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 50ff2169-9c1f-4f7a-b365-1949dac57f86 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844797] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 294a5f91-9db2-4a43-8230-d3e6906c30f0 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.844934] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 2545ca35-7a3f-47ed-b0de-e1bb26967379 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.845080] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 19253198-cb6e-4c48-a88b-26780f3606e8 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.845218] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 593.845514] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "624ec0e2-c230-4469-8ffe-047f914793b1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.846010] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.846010] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.846127] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock
"d74c7de4-5126-483f-9576-89e0007310b8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.846289] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.846457] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.848531] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.848531] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "19253198-cb6e-4c48-a88b-26780f3606e8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.848531] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.848531] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 593.848531] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Getting list of instances from cluster (obj){ [ 593.848531] env[59379]: value = "domain-c8" [ 593.848531] env[59379]: _type = "ClusterComputeResource" [ 593.848531] env[59379]: } {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 593.848829] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f43ec00-2fce-4e93-afb4-ad74fa264694 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.869403] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Got total of 7 instances {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 593.910092] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559524, 'name': CreateVM_Task, 'duration_secs': 0.335362} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 593.910255] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 593.910896] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.020552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 594.020920] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 594.021310] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.021602] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 594.021923] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 594.022183] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6d4e6d0d-5af6-490a-95ad-bf7c8d9d8da5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.026855] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for the task: (returnval){ [ 594.026855] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]521a0c71-335d-63d3-d28b-ddeb2f3f244f" [ 594.026855] env[59379]: _type = "Task" [ 594.026855] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 594.034517] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]521a0c71-335d-63d3-d28b-ddeb2f3f244f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 594.538812] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 594.539153] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 594.539360] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.725324] env[59379]: DEBUG nova.compute.manager [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Received event network-vif-plugged-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 594.725324] env[59379]: DEBUG oslo_concurrency.lockutils [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service nova] Acquiring lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.725324] env[59379]: DEBUG oslo_concurrency.lockutils [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service nova] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.725324] env[59379]: DEBUG oslo_concurrency.lockutils [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service nova] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.725486] env[59379]: DEBUG nova.compute.manager [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service 
nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] No waiting events found dispatching network-vif-plugged-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 594.725613] env[59379]: WARNING nova.compute.manager [req-b4f85f21-64b6-4420-815a-a9e7311e7d55 req-818602a4-5a68-473e-9cb5-1e13f3d617bd service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Received unexpected event network-vif-plugged-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 for instance with vm_state building and task_state spawning. [ 594.856339] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Successfully updated port: 176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 594.864918] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.865114] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquired lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 594.865274] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 595.028646] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 595.389597] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Successfully updated port: 61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 595.399013] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 595.399119] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquired lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 595.399359] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 595.501479] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 595.554000] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updating instance_info_cache with network_info: [{"id": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "address": "fa:16:3e:9e:c2:44", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap176ad8f5-a1", "ovs_interfaceid": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.571449] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Releasing lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 595.571658] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance network_info: |[{"id": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "address": "fa:16:3e:9e:c2:44", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap176ad8f5-a1", "ovs_interfaceid": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 595.572087] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc 
tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9e:c2:44', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '176ad8f5-a14c-4999-a6d9-7f1884bc95c4', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 595.580928] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Creating folder: Project (de7e8d74ff79471c9b29bb62d6ca8f7b). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.581752] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2a336ca-158b-47dd-8825-6b0dba2be762 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.593904] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Created folder: Project (de7e8d74ff79471c9b29bb62d6ca8f7b) in parent group-v140509. [ 595.594104] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Creating folder: Instances. Parent ref: group-v140531. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 595.594320] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-07b931bd-c8e8-41df-b0ac-a567ec9d634f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.607357] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Created folder: Instances in parent group-v140531. [ 595.607357] env[59379]: DEBUG oslo.service.loopingcall [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 595.607357] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 595.607692] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5cd087f9-cb2c-41c7-b8bc-70ebab158470 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.637247] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 595.637247] env[59379]: value = "task-559527" [ 595.637247] env[59379]: _type = "Task" [ 595.637247] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 595.645502] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559527, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 596.059227] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updating instance_info_cache with network_info: [{"id": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "address": "fa:16:3e:22:ba:7c", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61853293-fb", "ovs_interfaceid": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 596.071100] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Releasing lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 596.071213] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance network_info: |[{"id": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "address": "fa:16:3e:22:ba:7c", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61853293-fb", "ovs_interfaceid": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 596.071929] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 
tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:22:ba:7c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '61853293-fbbc-4e9f-b66e-7521676b5d2e', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 596.080883] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Creating folder: Project (6cc419217094416381972f1ec63d776f). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 596.081294] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-96d91213-a7bd-41d0-80bc-8bda93d3a396 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.093558] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Created folder: Project (6cc419217094416381972f1ec63d776f) in parent group-v140509. [ 596.093558] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Creating folder: Instances. Parent ref: group-v140534. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 596.093740] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03adff16-efc6-4b0f-9bff-f15143fbdfd4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.103325] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Created folder: Instances in parent group-v140534. [ 596.103804] env[59379]: DEBUG oslo.service.loopingcall [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 596.103804] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 596.103804] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cdf56244-310d-4f38-abe3-e3e341a311fe {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.125377] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 596.125377] env[59379]: value = "task-559530" [ 596.125377] env[59379]: _type = "Task" [ 596.125377] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 596.135663] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559530, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 596.145935] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559527, 'name': CreateVM_Task, 'duration_secs': 0.305907} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 596.146104] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 596.146767] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.146910] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 596.147230] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 596.147463] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2451595f-1ec7-4b9c-b3c6-031f50cd18bc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.152940] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for the task: (returnval){ [ 596.152940] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52b01ef4-157f-dd92-ef30-1166c2ad2743" [ 596.152940] env[59379]: _type = "Task" [ 596.152940] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 596.165032] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52b01ef4-157f-dd92-ef30-1166c2ad2743, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 596.396185] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Received event network-vif-plugged-b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 596.396395] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "ccaea0a9-59d6-456a-9885-2b90abf30abb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.396588] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.396740] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 596.396895] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] No waiting events found dispatching network-vif-plugged-b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 596.397060] env[59379]: WARNING nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Received unexpected event network-vif-plugged-b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 for instance with vm_state building and task_state spawning. [ 596.397217] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Received event network-changed-1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 596.397357] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Refreshing instance network info cache due to event network-changed-1ff46260-4a52-41ac-9057-7b7daabaf2bc. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 596.397522] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.397647] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 596.397868] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Refreshing network info cache for port 1ff46260-4a52-41ac-9057-7b7daabaf2bc {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 596.641666] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559530, 'name': CreateVM_Task, 'duration_secs': 0.331778} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 596.641917] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 596.642446] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.662913] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 596.667334] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 596.667334] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.667334] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 596.667334] 
env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 596.667508] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-edd16ecf-ca1a-489f-abb8-36e559c78cfd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.672491] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for the task: (returnval){ [ 596.672491] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52758328-9d6d-2829-90ba-5bda4f3622dd" [ 596.672491] env[59379]: _type = "Task" [ 596.672491] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 596.682427] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52758328-9d6d-2829-90ba-5bda4f3622dd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 597.045624] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Updated VIF entry in instance network info cache for port 1ff46260-4a52-41ac-9057-7b7daabaf2bc. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 597.047055] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Updating instance_info_cache with network_info: [{"id": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "address": "fa:16:3e:0c:bb:b8", "network": {"id": "83361975-cdd3-46a3-8792-09097b8b0152", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1585577391-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0c870ef0323546079b6471bb30e9eb36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6046aec4-feda-4ef9-bf4a-800de1e0cd3b", "external-id": "nsx-vlan-transportzone-903", "segmentation_id": 903, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ff46260-4a", "ovs_interfaceid": "1ff46260-4a52-41ac-9057-7b7daabaf2bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.058655] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-624ec0e2-c230-4469-8ffe-047f914793b1" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.059147] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Received event network-changed-b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 597.059147] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Refreshing instance network info cache due to event network-changed-b84b176c-1034-4a6b-a3b8-2d3a86aa5f85. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 597.059302] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 597.059591] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 597.059591] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Refreshing network info cache for port b84b176c-1034-4a6b-a3b8-2d3a86aa5f85 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 597.189160] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.189676] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 597.189935] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 597.477497] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.478069] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.479664] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 597.479664] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 597.506435] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Skipping network cache update for instance 
because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.506605] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.506740] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.506866] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.506991] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.507121] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.507284] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.507378] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.507574] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 597.507716] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 597.508594] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.508594] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.508785] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.508828] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.510560] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.510560] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.510560] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 597.510560] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 597.529846] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 597.530338] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 597.530338] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 597.530425] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 597.533200] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c619aa41-2d01-4fe8-a001-04d9670bda2e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.545053] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ce2fe4b-0d93-4695-be70-e0420ed42fe3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.565287] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d662c266-4b27-46ab-9fcb-7540d5a7d7b3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.577356] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8184be8e-6661-4169-8bcc-6bbd4790477b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.615153] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181766MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 597.615362] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 597.615682] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 597.708126] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 624ec0e2-c230-4469-8ffe-047f914793b1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708387] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance ccaea0a9-59d6-456a-9885-2b90abf30abb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708447] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance b9ffb5d9-8d56-4980-9e78-1e003cd56f7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708566] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance d74c7de4-5126-483f-9576-89e0007310b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708683] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708796] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.708909] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.709030] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 19253198-cb6e-4c48-a88b-26780f3606e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.709147] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 597.709343] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 597.709484] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 597.932812] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f359f1ec-4524-462f-bca1-cef2a177af1e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.945703] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de99d91c-ae59-4622-86e3-15e0b766eb27 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.985369] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f3125df-bf43-45b6-b950-fe9da7fce10b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.996874] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b48467b-01c0-4398-956d-170e61d1cdd9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.010234] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 598.028688] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 598.049347] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 598.049551] env[59379]: DEBUG oslo_concurrency.lockutils [None 
req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.055036] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Updated VIF entry in instance network info cache for port b84b176c-1034-4a6b-a3b8-2d3a86aa5f85. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 598.055581] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Updating instance_info_cache with network_info: [{"id": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "address": "fa:16:3e:31:4d:7b", "network": {"id": "11356636-2e2e-412c-aa2f-83b090e036a3", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1192875693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "940667196dca494b839e5099008c23db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c29724c-5452-441a-8060-5bf89d1f5847", "external-id": "nsx-vlan-transportzone-683", "segmentation_id": 683, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb84b176c-10", "ovs_interfaceid": "b84b176c-1034-4a6b-a3b8-2d3a86aa5f85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.066746] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-ccaea0a9-59d6-456a-9885-2b90abf30abb" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 598.067082] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Received event network-vif-plugged-6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 598.067267] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "d74c7de4-5126-483f-9576-89e0007310b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.067562] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "d74c7de4-5126-483f-9576-89e0007310b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.067600] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "d74c7de4-5126-483f-9576-89e0007310b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.067756] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] No waiting events found dispatching network-vif-plugged-6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 598.067916] env[59379]: WARNING nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Received unexpected event network-vif-plugged-6ab20f44-59b4-4f00-8db3-c070261d875e for instance with vm_state building and task_state spawning. [ 598.068450] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Received event network-changed-6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 598.068894] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Refreshing instance network info cache due to event network-changed-6ab20f44-59b4-4f00-8db3-c070261d875e. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 598.069125] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.069265] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.069423] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Refreshing network info cache for port 6ab20f44-59b4-4f00-8db3-c070261d875e {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 598.696985] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Received event network-changed-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 598.699717] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Refreshing instance network info cache due to event network-changed-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 598.700069] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquiring lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.700270] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquired lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.700464] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Refreshing network info cache for port 3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 599.064472] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Updated VIF entry in instance network info cache for port 6ab20f44-59b4-4f00-8db3-c070261d875e. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 599.067285] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Updating instance_info_cache with network_info: [{"id": "6ab20f44-59b4-4f00-8db3-c070261d875e", "address": "fa:16:3e:cb:35:b2", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.40", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ab20f44-59", "ovs_interfaceid": "6ab20f44-59b4-4f00-8db3-c070261d875e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.080727] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-d74c7de4-5126-483f-9576-89e0007310b8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 599.081087] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Received event network-vif-plugged-b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.081248] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 599.081442] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.081602] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.081796] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service 
nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] No waiting events found dispatching network-vif-plugged-b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.081933] env[59379]: WARNING nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Received unexpected event network-vif-plugged-b87075cb-577d-4937-9ab3-a88e6bfd27ea for instance with vm_state building and task_state spawning. [ 599.082277] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Received event network-vif-plugged-7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.082277] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 599.082507] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.082568] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.082851] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] No waiting events found dispatching network-vif-plugged-7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.082851] env[59379]: WARNING nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Received unexpected event network-vif-plugged-7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb for instance with vm_state building and task_state spawning. [ 599.083030] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Received event network-changed-b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.084139] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Refreshing instance network info cache due to event network-changed-b87075cb-577d-4937-9ab3-a88e6bfd27ea. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 599.084521] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.084521] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.084659] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Refreshing network info cache for port b87075cb-577d-4937-9ab3-a88e6bfd27ea {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 599.475809] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Updated VIF entry in instance network info cache for port 3a8f691e-810b-46e6-9adb-0a48e8b6d8f2. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 599.475809] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Updating instance_info_cache with network_info: [{"id": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "address": "fa:16:3e:7b:d0:58", "network": {"id": "fd4c2ac4-525a-40e0-b703-86fa9f875b12", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-762383049-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "51248048a2ed4ee1801cec899ba5301b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "88651df2-0506-4f6c-b868-dd30a81f2b1c", "external-id": "nsx-vlan-transportzone-366", "segmentation_id": 366, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a8f691e-81", "ovs_interfaceid": "3a8f691e-810b-46e6-9adb-0a48e8b6d8f2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.486716] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Releasing lock "refresh_cache-b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 599.486716] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Received event 
network-vif-plugged-176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.486716] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquiring lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 599.486716] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.486923] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.486923] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] No waiting events found dispatching network-vif-plugged-176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 599.486923] env[59379]: WARNING nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Received unexpected event network-vif-plugged-176ad8f5-a14c-4999-a6d9-7f1884bc95c4 for instance with vm_state building and task_state spawning. [ 599.486923] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Received event network-changed-176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.487069] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Refreshing instance network info cache due to event network-changed-176ad8f5-a14c-4999-a6d9-7f1884bc95c4. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 599.487069] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquiring lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.487069] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquired lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.487212] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Refreshing network info cache for port 176ad8f5-a14c-4999-a6d9-7f1884bc95c4 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 599.804818] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updated VIF entry in instance network info cache for port b87075cb-577d-4937-9ab3-a88e6bfd27ea. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 599.805172] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updating instance_info_cache with network_info: [{"id": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "address": "fa:16:3e:47:30:9f", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb87075cb-57", "ovs_interfaceid": "b87075cb-577d-4937-9ab3-a88e6bfd27ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.815324] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-294a5f91-9db2-4a43-8230-d3e6906c30f0" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 599.815651] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Received event network-changed-7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10998}} [ 599.815797] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Refreshing instance network info cache due to event network-changed-7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 599.815992] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.816145] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.816300] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Refreshing network info cache for port 7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 600.194182] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updated VIF entry in instance network info cache for port 176ad8f5-a14c-4999-a6d9-7f1884bc95c4. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 600.194975] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updating instance_info_cache with network_info: [{"id": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "address": "fa:16:3e:9e:c2:44", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.57", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap176ad8f5-a1", "ovs_interfaceid": "176ad8f5-a14c-4999-a6d9-7f1884bc95c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.207759] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Releasing lock "refresh_cache-50ff2169-9c1f-4f7a-b365-1949dac57f86" {{(pid=59379) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 600.207759] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Received event network-vif-plugged-61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 600.207759] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquiring lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.207759] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.207936] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.207936] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] No waiting events found dispatching network-vif-plugged-61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 600.207936] env[59379]: WARNING nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Received unexpected event network-vif-plugged-61853293-fbbc-4e9f-b66e-7521676b5d2e for instance with vm_state building and task_state spawning. [ 600.207936] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Received event network-changed-61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 600.208067] env[59379]: DEBUG nova.compute.manager [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Refreshing instance network info cache due to event network-changed-61853293-fbbc-4e9f-b66e-7521676b5d2e. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 600.208067] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquiring lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.208067] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Acquired lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.208152] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Refreshing network info cache for port 61853293-fbbc-4e9f-b66e-7521676b5d2e {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 600.428400] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updated VIF entry in instance network info cache for port 7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 600.428400] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updating instance_info_cache with network_info: [{"id": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "address": "fa:16:3e:3e:4b:15", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c97ef6c-1a", "ovs_interfaceid": "7c97ef6c-1a1c-4f58-aeb3-09a46550f9eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.436987] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-19253198-cb6e-4c48-a88b-26780f3606e8" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 600.437305] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Received event network-vif-plugged-8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10998}} [ 600.437525] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.437796] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.437977] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.438200] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] No waiting events found dispatching network-vif-plugged-8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 600.438457] env[59379]: WARNING nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Received unexpected event network-vif-plugged-8d77c178-471c-445d-920a-ef44f8a1e2ed for instance with vm_state building and task_state spawning. [ 600.438591] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Received event network-changed-8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 600.438776] env[59379]: DEBUG nova.compute.manager [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Refreshing instance network info cache due to event network-changed-8d77c178-471c-445d-920a-ef44f8a1e2ed. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 600.438991] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquiring lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.439232] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Acquired lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.439356] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Refreshing network info cache for port 8d77c178-471c-445d-920a-ef44f8a1e2ed {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 601.047864] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updated VIF entry in instance network info cache for port 61853293-fbbc-4e9f-b66e-7521676b5d2e. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 601.048232] env[59379]: DEBUG nova.network.neutron [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updating instance_info_cache with network_info: [{"id": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "address": "fa:16:3e:22:ba:7c", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.193", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61853293-fb", "ovs_interfaceid": "61853293-fbbc-4e9f-b66e-7521676b5d2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.064867] env[59379]: DEBUG oslo_concurrency.lockutils [req-2ccfa729-2ff8-40b0-891d-294dab40db61 req-50201f41-40e0-44fb-976c-e0786b0b8282 service nova] Releasing lock "refresh_cache-2545ca35-7a3f-47ed-b0de-e1bb26967379" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 601.248430] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updated VIF entry in instance network info cache for port 8d77c178-471c-445d-920a-ef44f8a1e2ed. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 601.248811] env[59379]: DEBUG nova.network.neutron [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updating instance_info_cache with network_info: [{"id": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "address": "fa:16:3e:43:4a:7f", "network": {"id": "8425a611-200b-4de6-9821-ac5e0ff6dbe3", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-1567750693-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a62f6b4b95a847bc914323ae8eca38fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9875d38f-76e2-416c-bfb7-f18a22b0d8ee", "external-id": "nsx-vlan-transportzone-442", "segmentation_id": 442, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77c178-47", "ovs_interfaceid": "8d77c178-471c-445d-920a-ef44f8a1e2ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.268734] env[59379]: DEBUG oslo_concurrency.lockutils [req-5b58cace-3c90-4616-9e38-73a6b4f8cd1f req-b8c8a7de-f09c-4d8b-97e7-a3b6d3c3479e service nova] Releasing lock "refresh_cache-91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 602.284801] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "71554abb-780c-4681-909f-8ff93712c82e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.284801] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.300381] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Starting instance... 
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 602.360076] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.360665] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.362853] env[59379]: INFO nova.compute.claims [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 602.604413] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba02441b-7516-4ad9-9b10-12969500b829 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.615711] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddac27b9-d16f-4b42-97f6-eff0d0fcf0fc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.650996] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de22527d-ab63-44b5-b31f-2322e443fd74 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.658792] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ffa1c9b-688c-4996-9999-0c4ce75dedeb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.674716] env[59379]: DEBUG nova.compute.provider_tree [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 602.682968] env[59379]: DEBUG nova.scheduler.client.report [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 602.706092] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 
tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.706092] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 602.750401] env[59379]: DEBUG nova.compute.utils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 602.751897] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 602.752093] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 602.769666] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 602.861140] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 602.889930] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 602.890188] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 602.890338] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 602.890514] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 602.890652] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 602.890902] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 602.891069] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 602.891227] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 602.891396] 
env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 602.891551] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 602.891715] env[59379]: DEBUG nova.virt.hardware [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 602.892606] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d65916c9-5491-4e72-a2e4-977ebc30bf59 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.905838] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa997d62-d544-4294-add1-d0d5e62b8340 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.932221] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.932441] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.096175] env[59379]: DEBUG nova.policy [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cf57234027f34707a730c895bcac8ccd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b54ecd325ea04fb58510dbc4b236d0e3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 604.029173] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "14e395c0-3650-40d6-82f1-1bd8f0b29984" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" 
{{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 604.029434] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "14e395c0-3650-40d6-82f1-1bd8f0b29984" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 604.108230] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Successfully created port: 48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 605.947150] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Successfully updated port: 48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 605.955666] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.955810] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquired lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.955956] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 606.042422] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 606.451277] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updating instance_info_cache with network_info: [{"id": "48410f53-2221-4bc5-8b42-c47079174d35", "address": "fa:16:3e:c7:c8:3b", "network": {"id": "5df54326-cfb1-473c-97d4-e34c87e07288", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-576478520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b54ecd325ea04fb58510dbc4b236d0e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19671de9-8b5b-4710-adc3-7419f3c0f171", "external-id": "nsx-vlan-transportzone-421", "segmentation_id": 421, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap48410f53-22", "ovs_interfaceid": "48410f53-2221-4bc5-8b42-c47079174d35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.464843] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Releasing lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.465257] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance network_info: |[{"id": "48410f53-2221-4bc5-8b42-c47079174d35", "address": "fa:16:3e:c7:c8:3b", "network": {"id": "5df54326-cfb1-473c-97d4-e34c87e07288", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-576478520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b54ecd325ea04fb58510dbc4b236d0e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19671de9-8b5b-4710-adc3-7419f3c0f171", "external-id": "nsx-vlan-transportzone-421", "segmentation_id": 421, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap48410f53-22", "ovs_interfaceid": "48410f53-2221-4bc5-8b42-c47079174d35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
606.465548] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c7:c8:3b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '19671de9-8b5b-4710-adc3-7419f3c0f171', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '48410f53-2221-4bc5-8b42-c47079174d35', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.473262] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Creating folder: Project (b54ecd325ea04fb58510dbc4b236d0e3). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.473804] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a79758e5-b8eb-47a6-aee4-42659cd39064 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.484920] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Created folder: Project (b54ecd325ea04fb58510dbc4b236d0e3) in parent group-v140509. [ 606.485124] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Creating folder: Instances. Parent ref: group-v140537. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.485633] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ae3a04c8-b6a2-487b-9bd8-c189b6f6b795 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.498833] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Created folder: Instances in parent group-v140537. [ 606.500088] env[59379]: DEBUG oslo.service.loopingcall [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.500088] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.500088] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ecb40ae9-8ead-4c17-89c1-e9db8de64075 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.521258] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.521258] env[59379]: value = "task-559533" [ 606.521258] env[59379]: _type = "Task" [ 606.521258] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.534096] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559533, 'name': CreateVM_Task} progress is 6%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.032290] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559533, 'name': CreateVM_Task, 'duration_secs': 0.307742} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 607.032537] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.033108] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.033255] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.033563] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 607.033795] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b98ae73a-9e47-4152-bb94-4ce86b52488d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.038851] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for the task: (returnval){ [ 607.038851] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52b0f0d9-e194-5c7d-133d-935263ffad47" [ 607.038851] env[59379]: _type = "Task" [ 607.038851] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.047834] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52b0f0d9-e194-5c7d-133d-935263ffad47, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.270159] env[59379]: DEBUG nova.compute.manager [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Received event network-vif-plugged-48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 607.270159] env[59379]: DEBUG oslo_concurrency.lockutils [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] Acquiring lock "71554abb-780c-4681-909f-8ff93712c82e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.270159] env[59379]: DEBUG oslo_concurrency.lockutils [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] Lock "71554abb-780c-4681-909f-8ff93712c82e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.270159] env[59379]: DEBUG oslo_concurrency.lockutils [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] Lock "71554abb-780c-4681-909f-8ff93712c82e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.272719] env[59379]: DEBUG nova.compute.manager [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] No waiting events found dispatching network-vif-plugged-48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 607.272719] env[59379]: WARNING nova.compute.manager [req-4aec2e63-afcf-45a9-b2a0-02b8ee3441e5 req-c17560a0-4e11-47ec-a41c-2a0ede204897 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Received unexpected event network-vif-plugged-48410f53-2221-4bc5-8b42-c47079174d35 for instance with vm_state building and task_state spawning. 
[ 607.555410] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.555660] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 607.555874] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.102251] env[59379]: DEBUG nova.compute.manager [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Received event network-changed-48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 611.102523] env[59379]: DEBUG nova.compute.manager [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Refreshing instance network info cache due to event network-changed-48410f53-2221-4bc5-8b42-c47079174d35. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 611.102523] env[59379]: DEBUG oslo_concurrency.lockutils [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] Acquiring lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.102523] env[59379]: DEBUG oslo_concurrency.lockutils [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] Acquired lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.103087] env[59379]: DEBUG nova.network.neutron [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Refreshing network info cache for port 48410f53-2221-4bc5-8b42-c47079174d35 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 612.258739] env[59379]: DEBUG nova.network.neutron [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updated VIF entry in instance network info cache for port 48410f53-2221-4bc5-8b42-c47079174d35. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 612.258739] env[59379]: DEBUG nova.network.neutron [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updating instance_info_cache with network_info: [{"id": "48410f53-2221-4bc5-8b42-c47079174d35", "address": "fa:16:3e:c7:c8:3b", "network": {"id": "5df54326-cfb1-473c-97d4-e34c87e07288", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-576478520-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b54ecd325ea04fb58510dbc4b236d0e3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "19671de9-8b5b-4710-adc3-7419f3c0f171", "external-id": "nsx-vlan-transportzone-421", "segmentation_id": 421, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap48410f53-22", "ovs_interfaceid": "48410f53-2221-4bc5-8b42-c47079174d35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.269918] env[59379]: DEBUG oslo_concurrency.lockutils [req-f1e4bcae-6de0-406c-b673-aaec1c346060 req-36e165de-a7c2-4c27-aebc-ba8379dd2a40 service nova] Releasing lock "refresh_cache-71554abb-780c-4681-909f-8ff93712c82e" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 638.724972] env[59379]: WARNING oslo_vmware.rw_handles [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 638.724972] env[59379]: ERROR oslo_vmware.rw_handles [ 638.726458] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf 
tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 638.730018] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 638.730018] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Copying Virtual Disk [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/d64fdc84-5241-4003-b9e5-4887c8e01e6f/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 638.730018] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-72b4e1e3-30b0-4a21-9037-6ae826d26e64 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 638.736324] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Waiting for the task: (returnval){ [ 638.736324] env[59379]: value = "task-559534" [ 638.736324] env[59379]: _type = "Task" [ 638.736324] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 638.746533] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Task: {'id': task-559534, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 639.246473] env[59379]: DEBUG oslo_vmware.exceptions [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 639.247257] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 639.249925] env[59379]: ERROR nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 639.249925] env[59379]: Faults: ['InvalidArgument'] [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Traceback (most recent call last): [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] yield resources [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self.driver.spawn(context, instance, image_meta, [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self._fetch_image_if_missing(context, vi) [ 639.249925] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] image_cache(vi, tmp_image_ds_loc) [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] vm_util.copy_virtual_disk( [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] session._wait_for_task(vmdk_copy_task) [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return self.wait_for_task(task_ref) [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return evt.wait() [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] result = hub.switch() [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 639.250452] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return self.greenlet.switch() [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self.f(*self.args, **self.kw) [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] raise exceptions.translate_fault(task_info.error) [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Faults: ['InvalidArgument'] [ 639.250822] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] [ 639.250822] env[59379]: INFO nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Terminating instance [ 639.251992] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 639.252200] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 639.252832] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Start 
destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 639.253017] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 639.253842] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b3b8eff-f4d9-45f0-967b-8ba8d2f3f08b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.255677] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0220d3c6-0177-4db1-ae32-85fbd1a15379 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.267885] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 639.271332] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-733aeb52-50f0-4497-98ac-24e77b34842c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.272743] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 639.272911] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 639.273722] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c6886568-9de9-4e44-9f75-1e5516b73ce1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.278690] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Waiting for the task: (returnval){ [ 639.278690] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52948da2-4b26-6457-0481-55a8602d3205" [ 639.278690] env[59379]: _type = "Task" [ 639.278690] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 639.287563] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52948da2-4b26-6457-0481-55a8602d3205, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 639.348018] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 639.348018] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 639.348018] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Deleting the datastore file [datastore2] 624ec0e2-c230-4469-8ffe-047f914793b1 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 639.348018] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4da1c175-a741-436a-bd9e-0ef5424a20e0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.356243] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Waiting for the task: (returnval){ [ 639.356243] env[59379]: value = "task-559536" [ 639.356243] env[59379]: _type = "Task" [ 639.356243] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 639.370743] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Task: {'id': task-559536, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 639.792727] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 639.792727] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Creating directory with path [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 639.792727] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af994c03-7f5a-4117-96b1-fc559ea370ac {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.803627] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Created directory with path [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 639.803747] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Fetch image to [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 639.803975] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 639.804664] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0818eeba-7a0e-4224-af46-3a0d7969baef {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.811567] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8664a9-e444-404b-a7cf-3f1c894dca9b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.820860] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74bc2951-f788-4183-9c91-f7e928e8eba1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.855542] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f1447e6a-74dd-4045-8bc3-f62c46d4c5d1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.873324] env[59379]: DEBUG oslo_vmware.api [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Task: {'id': task-559536, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068728} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 639.873896] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 639.874453] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 639.874453] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 639.874742] env[59379]: INFO nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 639.877560] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1a1b1925-232f-4e9c-870e-0d50e13fca29 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 639.881143] env[59379]: DEBUG nova.compute.claims [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 639.881143] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.881143] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.903985] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 640.005671] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 640.066940] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 640.066940] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 640.206439] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-133d2829-c0f3-46f0-a7e8-46ecc7f15f22 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.216074] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50d9936-436c-41c3-8f1c-897899a53432 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.250400] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4b32be9-e46c-40c6-aa0e-ef07d8db2405 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.258642] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd387c3-066f-4f3e-9fe0-cae440298bca {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.272960] env[59379]: DEBUG nova.compute.provider_tree [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 640.282825] env[59379]: DEBUG nova.scheduler.client.report [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 640.299217] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.419s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.300322] env[59379]: ERROR nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 640.300322] env[59379]: Faults: ['InvalidArgument'] [ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Traceback (most recent call last): [ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 
624ec0e2-c230-4469-8ffe-047f914793b1] self.driver.spawn(context, instance, image_meta,
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self._fetch_image_if_missing(context, vi)
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] image_cache(vi, tmp_image_ds_loc)
[ 640.300322] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] vm_util.copy_virtual_disk(
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] session._wait_for_task(vmdk_copy_task)
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return self.wait_for_task(task_ref)
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return evt.wait()
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] result = hub.switch()
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] return self.greenlet.switch()
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 640.300752] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] self.f(*self.args, **self.kw)
[ 640.301205] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 640.301205] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] raise exceptions.translate_fault(task_info.error)
[ 640.301205] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 640.301205] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Faults: ['InvalidArgument']
[ 640.301205] env[59379]: ERROR nova.compute.manager [instance: 624ec0e2-c230-4469-8ffe-047f914793b1]
[ 640.301205] env[59379]: DEBUG nova.compute.utils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 640.303441] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Build of instance 624ec0e2-c230-4469-8ffe-047f914793b1 was re-scheduled: A specified parameter was not correct: fileType
[ 640.303441] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 640.303830] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 640.304017] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
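The traceback above ends with oslo.vmware's _poll_task raising exceptions.translate_fault(task_info.error), which surfaces the vCenter fault as a VimFaultException whose fault_list is ['InvalidArgument']. As a rough, illustrative sketch only (this is not Nova's code; `session` and `task_ref` are hypothetical stand-ins for the objects named in the log), a caller could separate this parameter fault from other task failures like so:

    # Hedged sketch: classifying the fault seen in the log above.
    # Assumes oslo.vmware is installed; session/task_ref are placeholders.
    from oslo_vmware import exceptions as vexc

    def wait_and_classify(session, task_ref):
        try:
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as e:
            # translate_fault() records the server-side fault names in
            # fault_list, e.g. ['InvalidArgument'] for the fileType error here.
            if 'InvalidArgument' in e.fault_list:
                raise RuntimeError('vCenter rejected a disk-copy parameter: %s' % e)
            raise

Nova itself reacts by re-scheduling the build, as the next entries show.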
[ 640.304877] env[59379]: DEBUG nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 640.304877] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 641.362257] env[59379]: DEBUG nova.network.neutron [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 641.382012] env[59379]: INFO nova.compute.manager [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] Took 1.08 seconds to deallocate network for instance.
[ 641.502499] env[59379]: INFO nova.scheduler.client.report [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Deleted allocations for instance 624ec0e2-c230-4469-8ffe-047f914793b1
[ 641.527517] env[59379]: DEBUG oslo_concurrency.lockutils [None req-589318ae-4da4-4d37-a3b4-7b5c204a9fcf tempest-ServersTestFqdnHostnames-1996090690 tempest-ServersTestFqdnHostnames-1996090690-project-member] Lock "624ec0e2-c230-4469-8ffe-047f914793b1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 58.108s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 641.528700] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "624ec0e2-c230-4469-8ffe-047f914793b1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 47.683s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 641.528885] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 624ec0e2-c230-4469-8ffe-047f914793b1] During sync_power_state the instance has a pending task (spawning). Skip.
[ 641.529176] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "624ec0e2-c230-4469-8ffe-047f914793b1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 641.562739] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 641.638866] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 641.638866] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 641.643143] env[59379]: INFO nova.compute.claims [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 641.925872] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a87fa9cc-25a9-4c13-aaca-a695ae4bac86 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 641.935163] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca38860-3095-423c-abc5-58ee89bcf524 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 641.969295] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f722d29-dd06-4d79-9b83-ae62c7eb0e33 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 641.978506] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31480299-4bda-4e03-989a-0b8e72e14284 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.000297] env[59379]: DEBUG nova.compute.provider_tree [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 642.005335] env[59379]: DEBUG nova.scheduler.client.report [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 642.023597] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.385s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 642.024181] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 642.062612] env[59379]: DEBUG nova.compute.utils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 642.068451] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Not allocating networking since 'none' was specified. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 642.076834] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 642.162643] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 642.192676] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 642.192932] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 642.193037] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 642.193208] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 642.193340] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 642.193473] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 642.196524] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 642.196757] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 642.197485] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 642.197663] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 642.197894] env[59379]: DEBUG nova.virt.hardware [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 642.200790] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c524b93c-93b3-4db6-a157-2bc1a7cd8a8f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.211618] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf0a4eb8-5a08-4222-bfc1-6ad568ff451a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.234788] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance VIF info [] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 642.245958] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Creating folder: Project (5ea8a2bfcb214fecb3a7afea860a90da). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 642.247347] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e315e55f-dfd5-49bd-ac8e-5aa910d20268 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.261301] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Created folder: Project (5ea8a2bfcb214fecb3a7afea860a90da) in parent group-v140509.
[ 642.261500] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Creating folder: Instances. Parent ref: group-v140540. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 642.261721] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a7e3fb44-eeb3-40e2-9720-ff1256085ec1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.271537] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Created folder: Instances in parent group-v140540.
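Lock records like the "Acquiring lock ... acquired ... released" trio above recur throughout this log; they are emitted by oslo.concurrency's lockutils helpers. A minimal, hedged sketch of that pattern (illustrative only; the function names here are hypothetical, not Nova's):

    # Hedged sketch of the oslo.concurrency locking pattern behind the
    # "Acquiring/acquired/released" DEBUG lines in this log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Runs with the named "compute_resources" lock held; lockutils logs
        # the acquire/release lines with the waited/held durations.
        pass

    def locked_build(instance_uuid):
        # Per-instance build serialization, analogous to the
        # _locked_do_build_and_run_instance lock names above.
        with lockutils.lock(instance_uuid):
            instance_claim()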
[ 642.271757] env[59379]: DEBUG oslo.service.loopingcall [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 642.271931] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 642.272121] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f9ed90aa-8af3-4bcb-a7fc-e933fdeb4775 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 642.292548] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 642.292548] env[59379]: value = "task-559539"
[ 642.292548] env[59379]: _type = "Task"
[ 642.292548] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 642.300124] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559539, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 642.805518] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559539, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 643.307662] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559539, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 643.810471] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559539, 'name': CreateVM_Task, 'duration_secs': 1.247393} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
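The CreateVM_Task entries above show the polling loop inside oslo.vmware's wait_for_task: progress is sampled (0% -> 99%) until the task reports success or error. A plain-Python sketch of that loop, under stated assumptions (`get_task_info` is a hypothetical callable returning an object with .state, .progress and .error; the real code runs this inside an eventlet looping call rather than a sleep loop):

    # Hedged sketch of the task-polling pattern behind the
    # "progress is 0%/99%/completed successfully" lines above.
    import time

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        while True:
            info = get_task_info(task_ref)
            if info.state == 'success':
                return info                     # duration_secs ends up in the log
            if info.state == 'error':
                raise RuntimeError(info.error)  # oslo.vmware translates this fault
            print('progress is %s%%' % info.progress)
            time.sleep(interval)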
[ 643.811780] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 643.812188] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 643.812363] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 643.812675] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 643.812912] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c82406bc-6ca8-4c3c-9404-f8a93579000f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 643.817531] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for the task: (returnval){
[ 643.817531] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5216405c-2c51-301b-0a54-4e3eb9d2b0d7"
[ 643.817531] env[59379]: _type = "Task"
[ 643.817531] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 643.825366] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]5216405c-2c51-301b-0a54-4e3eb9d2b0d7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 644.335682] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 644.335682] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 644.335682] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 658.000738] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.001191] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.034483] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.034483] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 658.034483] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 658.093606] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097334] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097334] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097334] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097334] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097334] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097723] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097723] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097723] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097723] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 658.097723] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
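The cache-heal pass above, like the other ComputeManager._* entries around it, is driven by oslo.service's periodic task machinery. A hedged sketch of the usual declaration pattern (illustrative only; this class and the spacing value are assumptions, not Nova's actual configuration):

    # Hedged sketch of an oslo.service periodic task declaration.
    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        @periodic_task.periodic_task(spacing=60)  # spacing is an assumption
        def _heal_instance_info_cache(self, context):
            # Instances still building are skipped, as the entries above show.
            pass

    # A service loop then calls Manager().run_periodic_tasks(context),
    # which produces the "Running periodic task ..." lines.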
[ 658.097942] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.097942] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.097942] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 658.111565] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 658.111771] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 658.111932] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 658.112094] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 658.113287] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c584dd9e-1ece-4e9d-9735-939eab126029 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.124677] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b9514d2-b39c-4580-a6fb-74d612e24df4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.142757] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5834c1bc-a5b6-4552-957d-332f03812f7a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.151171] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22f3a93e-8dae-4910-92a8-d09e8db3129b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.185711] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181773MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 658.185907] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 658.186664] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 658.272849] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance ccaea0a9-59d6-456a-9885-2b90abf30abb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.272849] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance b9ffb5d9-8d56-4980-9e78-1e003cd56f7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.272849] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance d74c7de4-5126-483f-9576-89e0007310b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.272849] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273087] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273087] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273087] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 19253198-cb6e-4c48-a88b-26780f3606e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273087] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273230] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.273230] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 658.307169] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 14e395c0-3650-40d6-82f1-1bd8f0b29984 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 658.308506] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 658.308506] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 658.540619] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2a9238c-6319-408e-915a-e9e306bb4c2c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.551085] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba84e54-1050-4031-88b1-2d3187077d80 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.596606] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03055237-23dc-4bbb-a6b7-85c95e599699 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.608443] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8749905-0abe-4aef-88b4-7aef52caecef {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 658.629802] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 658.653778] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 658.677290] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 658.677504] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.491s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 659.016460] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 659.016827] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 659.433606] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 659.434773] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 659.434773] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
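As a quick sanity check on the inventory entry above: the capacity placement exposes per resource class is (total - reserved) * allocation_ratio. Computing that from the values actually logged (plain Python, arithmetic only):

    # Effective capacity implied by the inventory data logged above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, usable)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

With 10 vCPUs allocated in the final resource view above, the node is well under its 192-vCPU allocatable capacity.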
[ 667.516076] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "238825ed-3715-444c-be7c-f42f3884df7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 667.516342] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "238825ed-3715-444c-be7c-f42f3884df7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 667.568548] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "2342e3da-6d68-466a-9140-ced4eeda73d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 667.569029] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "2342e3da-6d68-466a-9140-ced4eeda73d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 668.832124] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "8264a1ad-cf20-404f-9d30-30c126e0c222" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 668.832124] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "8264a1ad-cf20-404f-9d30-30c126e0c222" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 670.775964] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 670.777283] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144
tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 671.578420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 671.578531] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 672.209079] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "03742e11-0fb2-48e2-9093-77ea7b647bf3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 672.209363] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "03742e11-0fb2-48e2-9093-77ea7b647bf3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 673.171546] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 673.173898] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 674.290345] env[59379]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Acquiring lock "5df12084-5dd6-41d1-9743-747f17ce3323" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 674.290345] env[59379]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "5df12084-5dd6-41d1-9743-747f17ce3323" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 675.881404] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Acquiring lock "2ed6496a-3e75-4cfd-88da-9e0b731f738a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 675.881744] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "2ed6496a-3e75-4cfd-88da-9e0b731f738a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 676.338998] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Acquiring lock "49d76773-e163-440b-aa99-08c379155149" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 676.339262] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "49d76773-e163-440b-aa99-08c379155149" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 681.471915] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Acquiring lock "dac8465a-592f-461c-af5b-49369eed5e70" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 681.472321] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "dac8465a-592f-461c-af5b-49369eed5e70" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 681.890441] env[59379]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "54605814-fdf4-43c7-9316-0d2594cdb5fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 681.890670] env[59379]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "54605814-fdf4-43c7-9316-0d2594cdb5fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 682.601662] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Acquiring lock "f196648e-0e82-4a01-91fc-af1ba61f0490" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 682.601929] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "f196648e-0e82-4a01-91fc-af1ba61f0490" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 682.747585] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Acquiring lock "66420486-d25e-457d-94cd-6f96fca2df7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 682.747946] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "66420486-d25e-457d-94cd-6f96fca2df7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 683.144990] env[59379]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Acquiring lock "a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 683.145345] env[59379]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 688.740600] env[59379]: WARNING oslo_vmware.rw_handles [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles response.begin()
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 688.740600] env[59379]: ERROR oslo_vmware.rw_handles
[ 688.741243] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 688.742662] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 688.742901] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Copying Virtual Disk [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/f3504d29-a435-472f-aa49-8cfe7b0ab267/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 688.743229] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a4286726-d3c7-40a7-b9db-b69a63828180 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 688.750916] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Waiting for the task: (returnval){
[ 688.750916] env[59379]: value = "task-559551"
[ 688.750916] env[59379]: _type = "Task"
[ 688.750916] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 688.759993] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Task: {'id': task-559551, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.261150] env[59379]: DEBUG oslo_vmware.exceptions [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 689.261393] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 689.261917] env[59379]: ERROR nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.261917] env[59379]: Faults: ['InvalidArgument'] [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Traceback (most recent call last): [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] yield resources [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self.driver.spawn(context, instance, image_meta, [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self._fetch_image_if_missing(context, vi) [ 689.261917] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] image_cache(vi, tmp_image_ds_loc) [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: 
ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] vm_util.copy_virtual_disk( [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] session._wait_for_task(vmdk_copy_task) [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return self.wait_for_task(task_ref) [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return evt.wait() [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] result = hub.switch() [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 689.262354] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return self.greenlet.switch() [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self.f(*self.args, **self.kw) [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] raise exceptions.translate_fault(task_info.error) [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Faults: ['InvalidArgument'] [ 689.262720] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] [ 689.262720] env[59379]: INFO nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Terminating instance [ 689.263774] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 689.263969] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 689.264581] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 689.264759] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 689.265012] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a555e63c-40f0-42af-b4a9-f9c92c9827f8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.267429] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1aaff344-80d0-416e-b00c-66a559d79b70 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.274233] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 689.274440] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5de8259b-ce4d-4b6b-80bf-17957d5a1e09 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.276507] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 689.276670] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 689.277619] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ee79d42-2578-4c85-a6c0-c880381945f5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.283145] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Waiting for the task: (returnval){ [ 689.283145] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5277af24-68fe-320c-aa90-2b20898f990a" [ 689.283145] env[59379]: _type = "Task" [ 689.283145] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 689.290388] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]5277af24-68fe-320c-aa90-2b20898f990a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.342199] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 689.342482] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 689.342583] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Deleting the datastore file [datastore2] ccaea0a9-59d6-456a-9885-2b90abf30abb {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 689.342811] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6bb336b2-4b3a-4e23-b755-7e600121ce15 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.348413] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Waiting for the task: (returnval){ [ 689.348413] env[59379]: value = "task-559553" [ 689.348413] env[59379]: _type = "Task" [ 689.348413] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 689.356423] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Task: {'id': task-559553, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.793524] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 689.793793] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Creating directory with path [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 689.793885] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e863eee9-b793-4ad4-963c-436793c7557d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.806755] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Created directory with path [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 689.807213] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Fetch image to [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 689.807386] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 689.808343] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a218099-0890-4ca5-a253-062fa0b10a42 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.815546] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d7a1d0e-53f4-4676-acbf-c1d717631f22 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.825084] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f717b14-6478-4c07-bd76-d0982d44f58a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.858717] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-d4bbe352-d5cc-452f-a156-50dc37fee5d1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.867102] env[59379]: DEBUG oslo_vmware.api [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Task: {'id': task-559553, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067599} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 689.868094] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 689.868275] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 689.868433] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 689.868592] env[59379]: INFO nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Took 0.60 seconds to destroy the instance on the hypervisor. 
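The CopyVirtualDisk_Task failure traced above follows oslo.vmware's standard task idiom: a *_Task method is invoked through the API session, wait_for_task() polls it (the "_poll_task ... progress is 0%" lines), and any server-side fault is translated into an exception (the earlier "Fault InvalidArgument not matched" line means no more specific exception class was registered, so a generic VimFaultException was raised). A minimal sketch of that idiom, assuming an already-configured VMwareAPISession with hypothetical endpoint, credentials, and datacenter/path arguments, not the exact nova call site:

    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    # Hypothetical endpoint and credentials.
    session = api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    def copy_virtual_disk(dc_ref, source_path, dest_path):
        """Invoke CopyVirtualDisk_Task and block until it finishes."""
        disk_mgr = session.vim.service_content.virtualDiskManager
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', disk_mgr,
            sourceName=source_path, sourceDatacenter=dc_ref,
            destName=dest_path, destDatacenter=dc_ref)
        try:
            # Polls until SUCCESS/ERROR; raises the translated fault on error.
            return session.wait_for_task(task)
        except vexc.VimFaultException as exc:
            # exc.fault_list carries fault names such as 'InvalidArgument'.
            raise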
[ 689.870374] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3c58702e-df29-4120-8bb3-e9187bfc91c8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.872243] env[59379]: DEBUG nova.compute.claims [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 689.872404] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.872603] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.905936] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 689.956892] env[59379]: DEBUG oslo_vmware.rw_handles [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 690.017498] env[59379]: DEBUG oslo_vmware.rw_handles [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 690.017672] env[59379]: DEBUG oslo_vmware.rw_handles [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 690.235017] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1525dda2-ce60-4e58-aad6-646d25aaeb72 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.242528] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7319cfb6-fcf2-499b-a8dd-000d498cf947 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.271980] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e6b14f4-8033-419b-84ef-8c7e01478662 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.278996] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca299653-4303-4b70-a868-8f1d417fa2cb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.291936] env[59379]: DEBUG nova.compute.provider_tree [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.299944] env[59379]: DEBUG nova.scheduler.client.report [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.316803] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.444s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.317340] env[59379]: ERROR nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 690.317340] env[59379]: Faults: ['InvalidArgument'] [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Traceback (most recent call last): [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 690.317340] env[59379]: ERROR 
nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self.driver.spawn(context, instance, image_meta, [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self._fetch_image_if_missing(context, vi) [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] image_cache(vi, tmp_image_ds_loc) [ 690.317340] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] vm_util.copy_virtual_disk( [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] session._wait_for_task(vmdk_copy_task) [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return self.wait_for_task(task_ref) [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return evt.wait() [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] result = hub.switch() [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] return self.greenlet.switch() [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 690.317665] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] self.f(*self.args, **self.kw) [ 690.318046] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 690.318046] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] raise exceptions.translate_fault(task_info.error) [ 690.318046] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 690.318046] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Faults: ['InvalidArgument'] [ 690.318046] env[59379]: ERROR nova.compute.manager [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] [ 690.318046] env[59379]: DEBUG nova.compute.utils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 690.319391] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Build of instance ccaea0a9-59d6-456a-9885-2b90abf30abb was re-scheduled: A specified parameter was not correct: fileType [ 690.319391] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 690.319737] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 690.319931] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 690.320111] env[59379]: DEBUG nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 690.320270] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 690.721105] env[59379]: DEBUG nova.network.neutron [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 690.730911] env[59379]: INFO nova.compute.manager [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] Took 0.41 seconds to deallocate network for instance. [ 690.816772] env[59379]: INFO nova.scheduler.client.report [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Deleted allocations for instance ccaea0a9-59d6-456a-9885-2b90abf30abb [ 690.839053] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9a868568-4b32-435f-a6f7-43caf74fac96 tempest-ImagesOneServerNegativeTestJSON-88470720 tempest-ImagesOneServerNegativeTestJSON-88470720-project-member] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 106.570s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.839423] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 96.993s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.839423] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: ccaea0a9-59d6-456a-9885-2b90abf30abb] During sync_power_state the instance has a pending task (spawning). Skip.
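The lock bookkeeping in these lines is the fingerprint of oslo.concurrency: the per-instance build lock for ccaea0a9-59d6-456a-9885-2b90abf30abb was held for 106.570s (the entire failed build and cleanup), and the _sync_power_states worker that had queued behind it for 96.993s then ran and correctly skipped the still-spawning instance. A minimal sketch of the two lockutils forms that emit the Acquiring/acquired/"released" lines with their waited/held timings; the lock names here are illustrative, reusing the UUID pattern from the log:

    from oslo_concurrency import lockutils

    # Decorator form: serializes all work on one instance; entry and exit
    # produce the 'acquired ... waited Ns' and '"released" ... held Ns' lines.
    @lockutils.synchronized('ccaea0a9-59d6-456a-9885-2b90abf30abb')
    def _locked_do_build_and_run_instance():
        pass  # build/run work happens while the lock is held

    # Context-manager form, as used around the resource tracker's
    # "compute_resources" claim/abort critical sections.
    with lockutils.lock('compute_resources'):
        pass  # inventory bookkeeping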
[ 690.839423] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "ccaea0a9-59d6-456a-9885-2b90abf30abb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.879208] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 690.926574] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.926816] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.928342] env[59379]: INFO nova.compute.claims [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 691.293169] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-964f8d04-0fe7-4bdd-bfe8-939514a8ea83 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.301164] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e4d714d-a67f-4f3e-86c1-1b1c3d37f46b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.331879] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a597fac-dd77-4cff-bbc5-964f0667c34d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.341024] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8847fdcb-3b48-41cf-a586-614c6d4a76ff {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.353061] env[59379]: DEBUG nova.compute.provider_tree [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 691.362052] env[59379]: DEBUG nova.scheduler.client.report [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Inventory has not changed for provider 
693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 691.375647] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.376136] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 691.409544] env[59379]: DEBUG nova.compute.utils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 691.411267] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 691.411436] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 691.422433] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 691.488324] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 691.511022] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=<?>,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-31T09:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 691.511249] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 691.511366] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 691.511545] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 691.511686] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 691.511827] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 691.512256] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 691.512460] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 691.512669] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d 
tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 691.512836] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 691.513050] env[59379]: DEBUG nova.virt.hardware [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 691.513903] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53295047-cb02-45c5-9ada-902325bf1b0f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.521837] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c574a6c-ceee-4e78-9c03-f821189aa98f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 691.717399] env[59379]: DEBUG nova.policy [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5cffd1ab1a0f45499abb7a5818170152', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6877b048a8b4486bbbc359726a58f5e6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 692.364798] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Successfully created port: 1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 693.208134] env[59379]: DEBUG nova.compute.manager [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Received event network-vif-plugged-1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 693.208134] env[59379]: DEBUG oslo_concurrency.lockutils [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] Acquiring lock "14e395c0-3650-40d6-82f1-1bd8f0b29984-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 693.208134] env[59379]: DEBUG oslo_concurrency.lockutils [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] Lock "14e395c0-3650-40d6-82f1-1bd8f0b29984-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.208134] env[59379]: DEBUG oslo_concurrency.lockutils [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] Lock "14e395c0-3650-40d6-82f1-1bd8f0b29984-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 693.208259] env[59379]: DEBUG nova.compute.manager [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] No waiting events found dispatching network-vif-plugged-1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 693.208259] env[59379]: WARNING nova.compute.manager [req-43653c73-7024-4d11-92db-1edc8fd07c11 req-97b34e6b-9304-4e45-9e25-8e36d153ea0c service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Received unexpected event network-vif-plugged-1862cb53-a896-4800-9a8a-86bbac11eeb1 for instance with vm_state building and task_state spawning. [ 693.350652] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Successfully updated port: 1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 693.367600] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 693.367600] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquired lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 693.367600] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 693.412417] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 693.676852] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Updating instance_info_cache with network_info: [{"id": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "address": "fa:16:3e:de:7b:e9", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1862cb53-a8", "ovs_interfaceid": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.688480] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Releasing lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 693.688941] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance network_info: |[{"id": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "address": "fa:16:3e:de:7b:e9", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1862cb53-a8", "ovs_interfaceid": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 693.689753] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d 
tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:7b:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1862cb53-a896-4800-9a8a-86bbac11eeb1', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 693.697832] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Creating folder: Project (6877b048a8b4486bbbc359726a58f5e6). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.698334] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d0a8862e-bcb7-4ba9-8726-c95291b98c2e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.710265] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Created folder: Project (6877b048a8b4486bbbc359726a58f5e6) in parent group-v140509. [ 693.710440] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Creating folder: Instances. Parent ref: group-v140547. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 693.710651] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-13b29d24-125e-48e8-8bc6-b708705d9372 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.719316] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Created folder: Instances in parent group-v140547. [ 693.719539] env[59379]: DEBUG oslo.service.loopingcall [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 693.719773] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 693.719906] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-83362184-1114-4762-8eda-b351d3b12136 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 693.738621] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 693.738621] env[59379]: value = "task-559556" [ 693.738621] env[59379]: _type = "Task" [ 693.738621] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 693.746126] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559556, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.250138] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559556, 'name': CreateVM_Task, 'duration_secs': 0.322417} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 694.250304] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 694.251374] env[59379]: DEBUG oslo_vmware.service [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a3b8367-a498-4bc4-944a-b771b36092bd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.256746] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 694.256897] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 694.257273] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 694.257489] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c46a9023-41f6-4c05-879f-f977b9eb5598 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.261591] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Waiting for the task: (returnval){ [ 694.261591] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52deb1fd-bb32-1026-78da-3e3efa208c5b" [ 694.261591] env[59379]: _type = "Task" [ 694.261591] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.269067] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52deb1fd-bb32-1026-78da-3e3efa208c5b, 'name': SearchDatastore_Task} progress is 0%. 
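Editor's note: the lock dance around "[datastore1] devstack-image-cache_base/a816e082-..." is how Nova serializes image-cache population. Whichever request acquires the per-image lock first probes the datastore (the SearchDatastore_Task above) and downloads the image if it is missing; later requests wait, then find it cached. A rough equivalent of the pattern, with `is_cached` and `fetch` as hypothetical callables:

```python
from oslo_concurrency import lockutils


def ensure_image_cached(datastore, image_id, is_cached, fetch):
    # One greenthread per image id populates the cache. The log also
    # takes an external (file-based) semaphore to serialize across
    # processes; that variant needs a configured lock_path.
    lock_name = f'[{datastore}] devstack-image-cache_base/{image_id}'
    with lockutils.lock(lock_name):
        if not is_cached(datastore, image_id):   # SearchDatastore_Task
            fetch(datastore, image_id)           # download into the cache
```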
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 694.772037] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.772428] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 694.772560] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 694.772662] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 694.772944] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 694.773058] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-12bfbebf-de46-4746-9273-65438d1761b6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.780583] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 694.780825] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Folder [datastore1] devstack-image-cache_base created. 
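Editor's note: the MakeDirectory sequence just above is deliberately idempotent. devstack-image-cache_base may already exist, and a concurrent boot may create it first; oslo.vmware surfaces that race as FileAlreadyExistsException, which the caller can swallow. A sketch, with `mkdir` standing in for a FileManager.MakeDirectory wrapper (an assumed helper):

```python
from oslo_vmware import exceptions as vexc


def create_folder_if_missing(mkdir, datastore_path):
    # "mkdir -p" semantics on the datastore: an existing folder is the
    # desired end state, not an error.
    try:
        mkdir(datastore_path)                 # FileManager.MakeDirectory
    except vexc.FileAlreadyExistsException:
        pass                                  # another request won the race
```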
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 694.781605] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ea92280-808c-415e-8ee2-3a28f3b7f676 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.787603] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dfa12a38-3433-433b-9720-aa28bc9f3194 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 694.792805] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Waiting for the task: (returnval){ [ 694.792805] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]521f5097-1c22-8b00-9db6-9fbb80e9d24c" [ 694.792805] env[59379]: _type = "Task" [ 694.792805] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 694.800299] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]521f5097-1c22-8b00-9db6-9fbb80e9d24c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 695.302878] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 695.303224] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Creating directory with path [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 695.303460] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a0a833ad-6871-43dc-8829-e4f40b80ddda {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.323622] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Created directory with path [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 695.323877] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Fetch image to [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 695.323987] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d 
tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 695.325052] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbb84fef-0fa1-4d15-a8e7-62118e7d601f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.333330] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71756a2a-743a-4368-8826-256da23451b0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.342339] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68b90491-5c41-4b2b-ad1f-7b8dca64a436 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.377026] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dded6a20-5b3f-4b74-8537-3b32460771b9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.383479] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-900b8c0f-7ab2-4b57-b81c-7083777715c7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 695.387026] env[59379]: DEBUG nova.compute.manager [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Received event network-changed-1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 695.387264] env[59379]: DEBUG nova.compute.manager [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Refreshing instance network info cache due to event network-changed-1862cb53-a896-4800-9a8a-86bbac11eeb1. 
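Editor's note: the network-changed-1862cb53-... event just received shows Neutron pushing a port update back into Nova. The handler re-reads the port and rewrites the instance's network info cache under the same refresh_cache-<uuid> lock the boot path holds, so the two writers cannot interleave. A simplified sketch, with `get_nw_info` and `save_cache` as hypothetical helpers for the Neutron round trip and the DB write:

```python
from oslo_concurrency import lockutils


def handle_network_changed(instance_uuid, port_id, get_nw_info, save_cache):
    # Same lock name as the "Acquiring lock refresh_cache-..." records,
    # so event-driven refreshes and boot-time cache fills serialize.
    with lockutils.lock(f'refresh_cache-{instance_uuid}'):
        nw_info = get_nw_info(instance_uuid, port_id)   # ask Neutron
        save_cache(instance_uuid, nw_info)              # persist the cache
```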
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 695.387402] env[59379]: DEBUG oslo_concurrency.lockutils [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] Acquiring lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.388015] env[59379]: DEBUG oslo_concurrency.lockutils [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] Acquired lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 695.388216] env[59379]: DEBUG nova.network.neutron [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Refreshing network info cache for port 1862cb53-a896-4800-9a8a-86bbac11eeb1 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 695.411275] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 695.467369] env[59379]: DEBUG oslo_vmware.rw_handles [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 695.523570] env[59379]: DEBUG oslo_vmware.rw_handles [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 695.523715] env[59379]: DEBUG oslo_vmware.rw_handles [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 695.826835] env[59379]: DEBUG nova.network.neutron [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Updated VIF entry in instance network info cache for port 1862cb53-a896-4800-9a8a-86bbac11eeb1. 
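Editor's note: the write handle opened above streams the 21,318,656-byte image straight to the host's /folder HTTP endpoint, with the datacenter and datastore carried as the dcPath and dsName query parameters. oslo.vmware builds this on raw http.client plumbing; below is a rough requests-based equivalent, where the cookie-carried session ticket is an assumption about the authentication in play:

```python
import requests


def upload_to_datastore(url, fileobj, size, session_cookie):
    # PUT the image bytes to e.g.
    #   https://<host>/folder/<path>.vmdk?dcPath=ha-datacenter&dsName=datastore1
    headers = {
        'Content-Type': 'application/octet-stream',
        'Content-Length': str(size),
        'Cookie': session_cookie,     # vCenter-issued ticket (assumed)
    }
    resp = requests.put(url, data=fileobj, headers=headers)
    resp.raise_for_status()           # surface 4xx/5xx from the host
    return resp
```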
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 695.827142] env[59379]: DEBUG nova.network.neutron [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Updating instance_info_cache with network_info: [{"id": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "address": "fa:16:3e:de:7b:e9", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.66", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1862cb53-a8", "ovs_interfaceid": "1862cb53-a896-4800-9a8a-86bbac11eeb1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 695.839273] env[59379]: DEBUG oslo_concurrency.lockutils [req-cbf77ad7-2710-4067-bb0a-e9f0c967f8eb req-b1a14b27-cae3-4bc0-91a3-49d7ade23e36 service nova] Releasing lock "refresh_cache-14e395c0-3650-40d6-82f1-1bd8f0b29984" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.641943] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "13aee471-4813-4376-a7bf-70f266d9a399" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.642253] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.431009] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 717.433620] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 717.433802] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 719.433550] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 719.433846] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 719.433846] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 719.454395] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.454552] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.454679] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.454800] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.454923] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455054] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455172] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455287] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455401] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455515] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 719.455631] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 719.456113] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 719.456279] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 719.456408] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 719.456543] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 719.465719] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.465925] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.466093] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.466247] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 719.467632] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-db0f3f77-d067-4ee1-b679-68843a8dd288 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.477198] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee48a2c7-984e-49fd-9758-11ffb76666bb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.492203] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61777ae-943b-460a-a047-d23fa626f93f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.499439] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fddb8154-053f-4ba1-94a1-096581d0a12c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.530470] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181762MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 719.530623] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.530749] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.617425] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance b9ffb5d9-8d56-4980-9e78-1e003cd56f7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.617602] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance d74c7de4-5126-483f-9576-89e0007310b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.617744] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.617867] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.617986] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.618118] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 19253198-cb6e-4c48-a88b-26780f3606e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.618235] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.618347] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.618470] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.618584] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 14e395c0-3650-40d6-82f1-1bd8f0b29984 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 719.643098] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 238825ed-3715-444c-be7c-f42f3884df7c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.654545] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2342e3da-6d68-466a-9140-ced4eeda73d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
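Editor's note: this audit is the resource tracker reconciling placement allocations against local reality. The records so far fall into two of the three buckets the code distinguishes, and the remaining records in this audit repeat the second pattern: instances actively managed here keep their allocations, instances scheduled here but "yet to start" are skipped, and allocations for deleted or moved instances (not triggered in this run) get removed. A simplified decision table, not Nova's exact checks:

```python
from dataclasses import dataclass


@dataclass
class Inst:
    deleted: bool
    host: str | None     # compute host the instance was built on, if any
    node: str | None


def keep_allocation(inst: Inst | None, my_host: str, my_node: str) -> bool:
    if inst is None or inst.deleted:
        return False     # stale allocation: remove it
    if inst.host is None:
        return True      # scheduled, build pending: skip healing for now
    return (inst.host, inst.node) == (my_host, my_node)  # actively managed
```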
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.665093] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 8264a1ad-cf20-404f-9d30-30c126e0c222 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.675570] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.685585] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a6ff207e-a925-46d1-9aaf-e06268d3c6f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.695990] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 03742e11-0fb2-48e2-9093-77ea7b647bf3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.706224] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.716270] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 5df12084-5dd6-41d1-9743-747f17ce3323 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.729716] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2ed6496a-3e75-4cfd-88da-9e0b731f738a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.742757] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 49d76773-e163-440b-aa99-08c379155149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.751861] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance dac8465a-592f-461c-af5b-49369eed5e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.763168] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 54605814-fdf4-43c7-9316-0d2594cdb5fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.773176] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance f196648e-0e82-4a01-91fc-af1ba61f0490 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.784088] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 66420486-d25e-457d-94cd-6f96fca2df7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.796034] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.805519] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 13aee471-4813-4376-a7bf-70f266d9a399 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
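Editor's note: the records that follow total these allocations (10 allocated VCPUs, and used_ram=1792MB, which lines up with ten 128 MB allocations plus the 512 MB the inventory reserves) and then PUT the refreshed inventory to placement. That first PUT fails with HTTP 409 placement.concurrent_update because placement guards each resource provider with a generation counter: a writer must echo the generation it last read, and any concurrent update bumps it. The log then shows the standard recovery, refresh and retry, sketched below against the real placement REST paths (auth token and microversion headers omitted):

```python
import requests


def set_inventory(base_url, provider_uuid, inventories, retries=3):
    # PUT /resource_providers/<uuid>/inventories must carry the provider
    # generation from the last read; on 409, re-read and try again.
    url = f'{base_url}/resource_providers/{provider_uuid}/inventories'
    for _ in range(retries):
        current = requests.get(url).json()    # inventories + generation
        payload = {
            'resource_provider_generation':
                current['resource_provider_generation'],
            'inventories': inventories,
        }
        resp = requests.put(url, json=payload)
        if resp.status_code != 409:           # success, or a non-conflict error
            resp.raise_for_status()
            return resp.json()
    raise RuntimeError(f'generation conflict persisted after {retries} tries')
```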
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 719.805756] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 719.805900] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 720.104252] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-546bb628-938c-4ca7-87ce-b16b80a80959 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.112182] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ffce8a-a27c-41d2-8c82-b5dcb4e41861 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.143534] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6c31143-5f5e-4731-a07c-a71d2542bb70 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.150771] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8530f782-70f4-40f8-a779-81d0e468b9f6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.164152] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 720.188744] env[59379]: ERROR nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [req-fc7d1bc0-d997-4c6e-863c-65aac6181c14] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID 693f1d2b-e627-44fb-bcd5-714cccac894b. 
Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-fc7d1bc0-d997-4c6e-863c-65aac6181c14"}]} [ 720.204289] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing inventories for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 720.217237] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating ProviderTree inventory for provider 693f1d2b-e627-44fb-bcd5-714cccac894b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 720.217444] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 720.228796] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing aggregate associations for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b, aggregates: None {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 720.243817] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing trait associations for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 720.525076] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0497e65-af2a-43ae-8eda-a8332148fdb0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.533441] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf21148-86ca-4ffd-bbf5-6214b12b823c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.563975] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79423695-4aec-482a-990b-f9a22759d147 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.571898] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-710c8663-2d3e-4782-922c-011690780ae3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.585998] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 720.619403] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updated inventory for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with generation 34 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 720.619578] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b generation from 34 to 35 during operation: update_inventory {{(pid=59379) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 720.619730] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 720.633579] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 720.633751] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.103s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 721.611513] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 721.611806] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running 
periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 737.656760] env[59379]: WARNING oslo_vmware.rw_handles [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 737.656760] env[59379]: ERROR oslo_vmware.rw_handles [ 737.657850] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 737.658987] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 737.659302] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Copying Virtual Disk [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/ec86762c-e2ed-4efe-ba85-51dd5e77b2bb/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 737.659593] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e7cc3970-4ad1-48d8-9830-9be52a1de61e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 737.667617] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 
tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Waiting for the task: (returnval){ [ 737.667617] env[59379]: value = "task-559567" [ 737.667617] env[59379]: _type = "Task" [ 737.667617] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 737.675562] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Task: {'id': task-559567, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 738.179086] env[59379]: DEBUG oslo_vmware.exceptions [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 738.179349] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 738.179956] env[59379]: ERROR nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 738.179956] env[59379]: Faults: ['InvalidArgument'] [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] Traceback (most recent call last): [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] yield resources [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self.driver.spawn(context, instance, image_meta, [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self._fetch_image_if_missing(context, vi) [ 738.179956] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] image_cache(vi, tmp_image_ds_loc) [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] vm_util.copy_virtual_disk( [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] session._wait_for_task(vmdk_copy_task) [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] return self.wait_for_task(task_ref) [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] return evt.wait() [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] result = hub.switch() [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 738.180292] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] return self.greenlet.switch() [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self.f(*self.args, **self.kw) [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] raise exceptions.translate_fault(task_info.error) [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] Faults: ['InvalidArgument'] [ 738.180591] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] [ 738.180591] env[59379]: INFO nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Terminating instance [ 738.181898] env[59379]: 
DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 738.182166] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 738.182344] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13a28561-0dc9-46cd-bb3d-1dd72e49b2d9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.184595] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 738.184784] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 738.185494] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2463f13-baab-4ea2-8a09-9c2aa3aed826 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.193114] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 738.193114] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c0813622-879f-47e7-b957-5e1a2bdb6f56 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.194891] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 738.195069] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 738.196011] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9642e305-8c76-4a12-9ac3-a3b1df05f4e9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.200489] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Waiting for the task: (returnval){ [ 738.200489] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52504109-d857-8fb1-9872-72e276374c60" [ 738.200489] env[59379]: _type = "Task" [ 738.200489] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 738.207778] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52504109-d857-8fb1-9872-72e276374c60, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 738.265287] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 738.265515] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 738.265658] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Deleting the datastore file [datastore2] d74c7de4-5126-483f-9576-89e0007310b8 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 738.265907] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-858425f6-0776-476e-b8c6-75f886250ba3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.272907] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Waiting for the task: (returnval){ [ 738.272907] env[59379]: value = "task-559569" [ 738.272907] env[59379]: _type = "Task" [ 738.272907] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 738.280838] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Task: {'id': task-559569, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 738.713148] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 738.713148] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Creating directory with path [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 738.713148] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad694179-fa63-4377-88d5-b45717d78efb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.722738] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Created directory with path [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 738.722738] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Fetch image to [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 738.722999] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 738.723568] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e59a8c6-fcf4-42d2-85c5-0c541dfc531a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.730207] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-942ee7be-cb42-401c-91f7-fb17f69e6bb5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.739927] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ebd66c6-3a2e-456e-bb64-e3b5d713a482 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.771768] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b6a9c8d-8528-4e8c-965b-4234f29feccc 
{{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.783088] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b0012863-f5e3-403b-942c-280d7e3f6126 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 738.784803] env[59379]: DEBUG oslo_vmware.api [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Task: {'id': task-559569, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073331} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 738.785047] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 738.785224] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 738.785386] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 738.785547] env[59379]: INFO nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Took 0.60 seconds to destroy the instance on the hypervisor. 
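
The records above show the failure that recurs throughout this run: vCenter rejects the CopyVirtualDisk_Task issued from _cache_sparse_image with an InvalidArgument fault on fileType; oslo.vmware's task poller translates the server-side fault into a VimFaultException, and the compute manager then aborts the spawn, unregisters the VM, and deletes the instance directory from the datastore while a second request re-fetches the image. Below is a minimal standalone sketch of that poll-and-translate pattern; it is not the oslo.vmware source, and the get_task_info callable and its attribute names are stand-ins for the PropertyCollector read the real _poll_task performs.

    import time


    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list


    def wait_for_task(get_task_info, interval=0.5):
        """Poll a vCenter-style task until it leaves the running state.

        get_task_info: callable returning an object with .state, .result
        and .error attributes (assumed shape, for illustration only).
        """
        while True:
            info = get_task_info()
            if info.state == "success":
                return info.result
            if info.state == "error":
                # Mirrors the raise at oslo_vmware/api.py:448 in the trace
                # above: the server-side fault surfaces as, e.g.,
                # VimFaultException(['InvalidArgument'],
                #     'A specified parameter was not correct: fileType')
                raise VimFaultException(list(info.error.faults),
                                        str(info.error.message))
            time.sleep(interval)  # the real loop is an eventlet loopingcall
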
[ 738.788927] env[59379]: DEBUG nova.compute.claims [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 738.789108] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 738.789329] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 738.804959] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 738.854848] env[59379]: DEBUG oslo_vmware.rw_handles [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 738.915638] env[59379]: DEBUG oslo_vmware.rw_handles [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 738.915859] env[59379]: DEBUG oslo_vmware.rw_handles [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 739.156841] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f157434-335e-4022-9319-f220b0bd516c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.164176] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ae35f5-8657-4c86-9b45-5d8ed4a24dc9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.193591] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41ab639b-a313-47c9-bb70-8ee18e05ac51 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.200562] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a380daa-13ea-49af-942f-e78a1f8fa878 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 739.214101] env[59379]: DEBUG nova.compute.provider_tree [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 739.245154] env[59379]: DEBUG nova.scheduler.client.report [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Updated inventory for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with generation 35 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 739.245446] env[59379]: DEBUG nova.compute.provider_tree [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Updating resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b generation from 35 to 36 during operation: update_inventory {{(pid=59379) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 739.245653] env[59379]: DEBUG nova.compute.provider_tree [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 739.259584] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.470s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.259799] env[59379]: ERROR nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.259799] env[59379]: Faults: ['InvalidArgument'] [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] Traceback (most recent call last): [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self.driver.spawn(context, instance, image_meta, [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self._fetch_image_if_missing(context, vi) [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] image_cache(vi, tmp_image_ds_loc) [ 739.259799] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] vm_util.copy_virtual_disk( [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] session._wait_for_task(vmdk_copy_task) [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 739.260170] env[59379]: ERROR nova.compute.manager 
[instance: d74c7de4-5126-483f-9576-89e0007310b8] return self.wait_for_task(task_ref) [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] return evt.wait() [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] result = hub.switch() [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] return self.greenlet.switch() [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 739.260170] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] self.f(*self.args, **self.kw) [ 739.260538] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 739.260538] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] raise exceptions.translate_fault(task_info.error) [ 739.260538] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 739.260538] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] Faults: ['InvalidArgument'] [ 739.260538] env[59379]: ERROR nova.compute.manager [instance: d74c7de4-5126-483f-9576-89e0007310b8] [ 739.260538] env[59379]: DEBUG nova.compute.utils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 739.262216] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Build of instance d74c7de4-5126-483f-9576-89e0007310b8 was re-scheduled: A specified parameter was not correct: fileType [ 739.262216] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 739.263020] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 739.263143] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 
tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 739.263341] env[59379]: DEBUG nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 739.263588] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 739.639264] env[59379]: DEBUG nova.network.neutron [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 739.651174] env[59379]: INFO nova.compute.manager [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] [instance: d74c7de4-5126-483f-9576-89e0007310b8] Took 0.38 seconds to deallocate network for instance. [ 739.745858] env[59379]: INFO nova.scheduler.client.report [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Deleted allocations for instance d74c7de4-5126-483f-9576-89e0007310b8 [ 739.767221] env[59379]: DEBUG oslo_concurrency.lockutils [None req-937591e0-6edb-4f12-92c5-69e525c19672 tempest-ServerDiagnosticsNegativeTest-683359294 tempest-ServerDiagnosticsNegativeTest-683359294-project-member] Lock "d74c7de4-5126-483f-9576-89e0007310b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 152.839s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.768297] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "d74c7de4-5126-483f-9576-89e0007310b8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 145.922s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.768480] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: d74c7de4-5126-483f-9576-89e0007310b8] During sync_power_state the instance has a pending task (spawning). Skip.
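
Before rescheduling, the abort path above releases the compute_resources claim and republishes the host's inventory to Placement. As a reference point, the schedulable capacity Placement derives from such an inventory record is (total - reserved) * allocation_ratio per resource class; the short sketch below applies that formula to the numbers logged for provider 693f1d2b-e627-44fb-bcd5-714cccac894b. The helper is illustrative, not Nova or Placement code.

    # Inventory copied from the provider-tree update logged above.
    INVENTORY = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }


    def capacity(inventory):
        """Schedulable amount per resource class: (total - reserved) * ratio."""
        return {
            rc: int((spec["total"] - spec["reserved"]) * spec["allocation_ratio"])
            for rc, spec in inventory.items()
        }


    print(capacity(INVENTORY))
    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
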
[ 739.768641] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "d74c7de4-5126-483f-9576-89e0007310b8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 739.778727] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 739.831975] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 739.832314] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.834241] env[59379]: INFO nova.compute.claims [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 740.188100] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-095a83bb-c2f0-443e-906d-010da27b1893 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.195892] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5bc61c-fd5f-49bc-92b6-339b2ff6d1fc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.227531] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-447a1c9e-970e-4a47-9ed9-64cdd8ba16bb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.235053] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4949ed7-e7fc-4f86-844e-4de22bc58b9d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.248087] env[59379]: DEBUG nova.compute.provider_tree [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 740.256270] env[59379]: DEBUG nova.scheduler.client.report [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Inventory has not changed for
provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 740.272750] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.440s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.273458] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 740.309159] env[59379]: DEBUG nova.compute.utils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 740.312958] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 740.312958] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 740.318649] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 740.397062] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 740.401216] env[59379]: DEBUG nova.policy [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e326832f5f0244e28c495002df50b11d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '171afaa5f3e84fce99d714d965673aab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 740.420214] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 740.420448] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 740.420652] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 740.420849] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 740.420995] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 740.421252] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 740.421460] env[59379]: DEBUG nova.virt.hardware 
[None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 740.421613] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 740.421769] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 740.421925] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 740.422107] env[59379]: DEBUG nova.virt.hardware [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 740.423192] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b0fdff6-d7f8-4264-b2f9-337ee94e88b0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.431424] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeaf63f5-d07d-4524-b8b4-6f8fd992c9d5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.030708] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Successfully created port: 457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 741.316342] env[59379]: WARNING oslo_vmware.rw_handles [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 318, in begin [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 741.316342] env[59379]: ERROR oslo_vmware.rw_handles [ 741.316825] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 741.318289] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 741.318549] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Copying Virtual Disk [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/9def3c0a-a587-4035-b9f7-7be9b1585b1d/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 741.318862] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2ed0aab5-1f8f-411c-8c96-ecf59a48a5df {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.326894] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Waiting for the task: (returnval){ [ 741.326894] env[59379]: value = "task-559570" [ 741.326894] env[59379]: _type = "Task" [ 741.326894] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 741.335343] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Task: {'id': task-559570, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 741.841897] env[59379]: DEBUG oslo_vmware.exceptions [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 741.842979] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 741.843764] env[59379]: ERROR nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 741.843764] env[59379]: Faults: ['InvalidArgument'] [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Traceback (most recent call last): [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] yield resources [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self.driver.spawn(context, instance, image_meta, [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self._vmops.spawn(context, instance, image_meta, injected_files, [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self._fetch_image_if_missing(context, vi) [ 741.843764] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] image_cache(vi, tmp_image_ds_loc) [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] vm_util.copy_virtual_disk( [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] session._wait_for_task(vmdk_copy_task) [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 741.845042] 
env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return self.wait_for_task(task_ref) [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return evt.wait() [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] result = hub.switch() [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 741.845042] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return self.greenlet.switch() [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self.f(*self.args, **self.kw) [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] raise exceptions.translate_fault(task_info.error) [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Faults: ['InvalidArgument'] [ 741.845492] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] [ 741.845492] env[59379]: INFO nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Terminating instance [ 741.847480] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 741.848393] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 741.849229] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d6d5095-b244-4473-b823-dd6e60e52d69 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.858529] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 741.858770] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c9426eb7-c9b6-4670-ad2c-c6b1ad877992 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.939636] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 741.939868] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 741.940060] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Deleting the datastore file [datastore1] 14e395c0-3650-40d6-82f1-1bd8f0b29984 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 741.940301] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-33e592f8-fa13-4e54-9d46-a8e8f767e796 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.947580] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Waiting for the task: (returnval){ [ 741.947580] env[59379]: value = "task-559572" [ 741.947580] env[59379]: _type = "Task" [ 741.947580] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 741.955555] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Task: {'id': task-559572, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 742.266650] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Successfully updated port: 457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 742.285105] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 742.285303] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquired lock "refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 742.285489] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 742.295510] env[59379]: DEBUG nova.compute.manager [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Received event network-vif-plugged-457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 742.295710] env[59379]: DEBUG oslo_concurrency.lockutils [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] Acquiring lock "238825ed-3715-444c-be7c-f42f3884df7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.295941] env[59379]: DEBUG oslo_concurrency.lockutils [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] Lock "238825ed-3715-444c-be7c-f42f3884df7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.296167] env[59379]: DEBUG oslo_concurrency.lockutils [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] Lock "238825ed-3715-444c-be7c-f42f3884df7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.296300] env[59379]: DEBUG nova.compute.manager [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] No waiting events found dispatching network-vif-plugged-457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 742.296415] env[59379]: WARNING
nova.compute.manager [req-307557e8-f014-4721-9024-ef558863caf8 req-fe97ff55-011b-460d-8a03-07dfbbf02d71 service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Received unexpected event network-vif-plugged-457011f1-233f-4316-bfbf-dbda2457934a for instance with vm_state building and task_state spawning. [ 742.356429] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 742.457250] env[59379]: DEBUG oslo_vmware.api [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Task: {'id': task-559572, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074305} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 742.457473] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 742.457641] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 742.457903] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 742.458089] env[59379]: INFO nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Took 0.61 seconds to destroy the instance on the hypervisor. 
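The destroy sequence that just completed is the driver's standard two-step teardown: VirtualMachine.UnregisterVM removes the VM from vCenter inventory, then FileManager.DeleteDatastoreFile_Task removes its directory from the datastore, with the returned Task polled until it finishes (the "progress is 0%" line followed by "completed successfully"). A minimal sketch of that invoke-then-poll pattern with oslo.vmware, assuming a reachable vCenter; the host, credentials, poll interval, and dc_ref below are illustrative placeholders, not values from this log:

    from oslo_vmware import api

    # Session setup mirrors VMwareAPISession._create_session seen earlier
    # in this log; host and credentials here are placeholders.
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        10,    # api_retry_count
        0.5)   # task_poll_interval: seconds between _poll_task calls

    # DeleteDatastoreFile_Task returns a Task moref immediately;
    # wait_for_task() polls it and raises a VimFaultException if the
    # task ends in the error state.
    dc_ref = None  # placeholder: Nova passes the instance's Datacenter moref
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 14e395c0-3650-40d6-82f1-1bd8f0b29984',
        datacenter=dc_ref)
    session.wait_for_task(task)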
[ 742.460256] env[59379]: DEBUG nova.compute.claims [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 742.460412] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 742.460615] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 742.599406] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Updating instance_info_cache with network_info: [{"id": "457011f1-233f-4316-bfbf-dbda2457934a", "address": "fa:16:3e:87:2a:79", "network": {"id": "e77bec27-f327-4043-8c16-83dbbaa9de90", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1165798798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "171afaa5f3e84fce99d714d965673aab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap457011f1-23", "ovs_interfaceid": "457011f1-233f-4316-bfbf-dbda2457934a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 742.622015] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Releasing lock "refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 742.622015] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance network_info: |[{"id": "457011f1-233f-4316-bfbf-dbda2457934a", "address": "fa:16:3e:87:2a:79", "network": {"id": "e77bec27-f327-4043-8c16-83dbbaa9de90", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1165798798-network", 
"subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "171afaa5f3e84fce99d714d965673aab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap457011f1-23", "ovs_interfaceid": "457011f1-233f-4316-bfbf-dbda2457934a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 742.622428] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:2a:79', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '72781990-3cb3-42eb-9eb1-4040dedbf66f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '457011f1-233f-4316-bfbf-dbda2457934a', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 742.629246] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Creating folder: Project (171afaa5f3e84fce99d714d965673aab). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 742.632427] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eb30310b-c4d7-46d7-bd40-239f2760e533 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.647017] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Created folder: Project (171afaa5f3e84fce99d714d965673aab) in parent group-v140509. [ 742.647017] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Creating folder: Instances. Parent ref: group-v140554. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 742.647017] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-130c736a-42b9-4cc2-a846-94aced06ca71 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.658023] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Created folder: Instances in parent group-v140554. 
[ 742.658023] env[59379]: DEBUG oslo.service.loopingcall [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 742.658023] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 742.658023] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c967d282-8930-4cdc-b4cf-e08d3e555aaa {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.679699] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 742.679699] env[59379]: value = "task-559575" [ 742.679699] env[59379]: _type = "Task" [ 742.679699] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 742.689784] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559575, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 742.861682] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca75420d-a322-45cf-bf90-1f823dae4c8b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.870215] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c37ff5eb-7336-4b1a-8aff-66d99ab17fcb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.903227] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f33bbfa7-4beb-4217-9f47-7d56dd6d6702 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.911250] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b030d404-a9ea-40a2-8b69-874b1098d407 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 742.925056] env[59379]: DEBUG nova.compute.provider_tree [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 742.933443] env[59379]: DEBUG nova.scheduler.client.report [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 742.951464] env[59379]: DEBUG 
oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.491s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 742.951952] env[59379]: ERROR nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 742.951952] env[59379]: Faults: ['InvalidArgument']
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Traceback (most recent call last):
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self.driver.spawn(context, instance, image_meta,
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self._fetch_image_if_missing(context, vi)
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] image_cache(vi, tmp_image_ds_loc)
[ 742.951952] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] vm_util.copy_virtual_disk(
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] session._wait_for_task(vmdk_copy_task)
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return self.wait_for_task(task_ref)
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return evt.wait()
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] result = hub.switch()
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] return self.greenlet.switch()
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 742.952338] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] self.f(*self.args, **self.kw)
[ 742.952691] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 742.952691] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] raise exceptions.translate_fault(task_info.error)
[ 742.952691] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 742.952691] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Faults: ['InvalidArgument']
[ 742.952691] env[59379]: ERROR nova.compute.manager [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984]
[ 742.952838] env[59379]: DEBUG nova.compute.utils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 742.954294] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Build of instance 14e395c0-3650-40d6-82f1-1bd8f0b29984 was re-scheduled: A specified parameter was not correct: fileType
[ 742.954294] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 742.954671] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 742.954855] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 742.955053] env[59379]: DEBUG nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 742.955221] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 743.190019] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559575, 'name': CreateVM_Task, 'duration_secs': 0.318699} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 743.191134] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 743.191790] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.191942] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 743.192453] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 743.192542] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e45981d4-1a20-47ea-bfe1-584dfd5e0634 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.196833] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Waiting for the task: (returnval){ [ 743.196833] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52178916-6bd6-3c70-407b-d1a52ee27df9" [ 743.196833] env[59379]: _type = "Task" [ 743.196833] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 743.203914] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52178916-6bd6-3c70-407b-d1a52ee27df9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 743.392054] env[59379]: DEBUG nova.network.neutron [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 743.408173] env[59379]: INFO nova.compute.manager [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] [instance: 14e395c0-3650-40d6-82f1-1bd8f0b29984] Took 0.45 seconds to deallocate network for instance. [ 743.500530] env[59379]: INFO nova.scheduler.client.report [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Deleted allocations for instance 14e395c0-3650-40d6-82f1-1bd8f0b29984 [ 743.526599] env[59379]: DEBUG oslo_concurrency.lockutils [None req-21bb3429-6173-4677-9a8c-712e7483772d tempest-MigrationsAdminTest-1277263470 tempest-MigrationsAdminTest-1277263470-project-member] Lock "14e395c0-3650-40d6-82f1-1bd8f0b29984" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 139.494s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 743.543638] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Starting instance... 
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 743.591730] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 743.591969] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 743.593540] env[59379]: INFO nova.compute.claims [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 743.714949] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 743.715381] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 743.715729] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 743.956937] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c27ba4f-e17c-4029-ae71-fcab24d57787 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.965902] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de16bcb6-0c15-4f2c-8d7d-0c338bf7f042 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 743.997089] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa5721ef-8247-4c37-90b9-cc56aff6b4bd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.004861] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b33cfec-7665-428e-a654-292dd2e093e4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.018634] env[59379]: DEBUG 
nova.compute.provider_tree [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 744.027233] env[59379]: DEBUG nova.scheduler.client.report [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 744.040136] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.448s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 744.040649] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 744.069950] env[59379]: DEBUG nova.compute.utils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 744.071751] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 744.072119] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 744.081630] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 744.173676] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 744.194916] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 744.195182] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 744.195342] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 744.195520] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 744.195658] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 744.195794] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 744.195992] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 
744.196160] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 744.196321] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 744.196479] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 744.196672] env[59379]: DEBUG nova.virt.hardware [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 744.197528] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8015d2be-b490-4693-849e-3c1e85c1558a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.205614] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee9c8962-1801-4ed9-bd00-3f6cee34e210 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.344110] env[59379]: DEBUG nova.policy [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c75afbe4a97240c49aaed07125b26a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b2f7e4ed23c24eaf9e9b0300e9b8b2bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 744.503253] env[59379]: DEBUG nova.compute.manager [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Received event network-changed-457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 744.503528] env[59379]: DEBUG nova.compute.manager [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Refreshing instance network info cache due to event network-changed-457011f1-233f-4316-bfbf-dbda2457934a. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 744.503632] env[59379]: DEBUG oslo_concurrency.lockutils [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] Acquiring lock "refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 744.503762] env[59379]: DEBUG oslo_concurrency.lockutils [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] Acquired lock "refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 744.503911] env[59379]: DEBUG nova.network.neutron [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Refreshing network info cache for port 457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 745.032229] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Successfully created port: 21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 745.076609] env[59379]: DEBUG nova.network.neutron [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Updated VIF entry in instance network info cache for port 457011f1-233f-4316-bfbf-dbda2457934a. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 745.077095] env[59379]: DEBUG nova.network.neutron [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Updating instance_info_cache with network_info: [{"id": "457011f1-233f-4316-bfbf-dbda2457934a", "address": "fa:16:3e:87:2a:79", "network": {"id": "e77bec27-f327-4043-8c16-83dbbaa9de90", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1165798798-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "171afaa5f3e84fce99d714d965673aab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "72781990-3cb3-42eb-9eb1-4040dedbf66f", "external-id": "cl2-zone-812", "segmentation_id": 812, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap457011f1-23", "ovs_interfaceid": "457011f1-233f-4316-bfbf-dbda2457934a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 745.092417] env[59379]: DEBUG oslo_concurrency.lockutils [req-bc8df97d-8bd9-4770-b79e-f5298dcc42bd req-29de4163-cb71-4f95-b69f-5a7426e50f6c service nova] Releasing lock 
"refresh_cache-238825ed-3715-444c-be7c-f42f3884df7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 746.068469] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Successfully updated port: 21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 746.089552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 746.089552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquired lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 746.089552] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 746.147872] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 746.709550] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Updating instance_info_cache with network_info: [{"id": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "address": "fa:16:3e:a3:18:64", "network": {"id": "c0caf4c9-01b4-432e-97ba-114420bd60b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-750687695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2f7e4ed23c24eaf9e9b0300e9b8b2bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21089f3a-3b", "ovs_interfaceid": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.722231] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Releasing lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 746.722513] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance network_info: |[{"id": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "address": "fa:16:3e:a3:18:64", "network": {"id": "c0caf4c9-01b4-432e-97ba-114420bd60b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-750687695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2f7e4ed23c24eaf9e9b0300e9b8b2bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21089f3a-3b", "ovs_interfaceid": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 746.722874] env[59379]: 
DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a3:18:64', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '571cdf48-7016-4715-8739-4cb70c90cd6d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '21089f3a-3b08-442c-bea7-cebbbcd759fa', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 746.731838] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Creating folder: Project (b2f7e4ed23c24eaf9e9b0300e9b8b2bf). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 746.733255] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e189773d-2c89-42ae-9838-edba5122bca6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.747043] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Created folder: Project (b2f7e4ed23c24eaf9e9b0300e9b8b2bf) in parent group-v140509. [ 746.747241] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Creating folder: Instances. Parent ref: group-v140557. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 746.747466] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5036d2d0-ca1a-465a-bb80-01ce74e97a6c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.759880] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Created folder: Instances in parent group-v140557. [ 746.759880] env[59379]: DEBUG oslo.service.loopingcall [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 746.759880] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 746.759880] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-69f7b0b0-a6a8-4906-ad19-72b0d9820e12 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.779352] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 746.779352] env[59379]: value = "task-559578" [ 746.779352] env[59379]: _type = "Task" [ 746.779352] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 746.786953] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559578, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 746.902035] env[59379]: DEBUG nova.compute.manager [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Received event network-vif-plugged-21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 746.902035] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Acquiring lock "2342e3da-6d68-466a-9140-ced4eeda73d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.902035] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Lock "2342e3da-6d68-466a-9140-ced4eeda73d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.902035] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Lock "2342e3da-6d68-466a-9140-ced4eeda73d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.902592] env[59379]: DEBUG nova.compute.manager [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] No waiting events found dispatching network-vif-plugged-21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 746.902592] env[59379]: WARNING nova.compute.manager [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Received unexpected event network-vif-plugged-21089f3a-3b08-442c-bea7-cebbbcd759fa for instance with vm_state building and task_state spawning. [ 746.902592] env[59379]: DEBUG nova.compute.manager [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Received event network-changed-21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 746.902592] env[59379]: DEBUG nova.compute.manager [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Refreshing instance network info cache due to event network-changed-21089f3a-3b08-442c-bea7-cebbbcd759fa. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 746.902722] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Acquiring lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 746.903796] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Acquired lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 746.903796] env[59379]: DEBUG nova.network.neutron [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Refreshing network info cache for port 21089f3a-3b08-442c-bea7-cebbbcd759fa {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 747.290379] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559578, 'name': CreateVM_Task, 'duration_secs': 0.341664} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 747.292697] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 747.293450] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 747.293558] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 747.294779] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 747.294779] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-58cc1c53-68e6-44e5-8351-be500fcd7c36 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.299250] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Waiting for the task: (returnval){ [ 747.299250] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]523fefeb-0ec7-193d-9729-4056ac1b5740" [ 747.299250] env[59379]: _type = "Task" [ 747.299250] env[59379]: } to complete. 
[ 747.309717] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]523fefeb-0ec7-193d-9729-4056ac1b5740, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 747.390033] env[59379]: DEBUG nova.network.neutron [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Updated VIF entry in instance network info cache for port 21089f3a-3b08-442c-bea7-cebbbcd759fa. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 747.390033] env[59379]: DEBUG nova.network.neutron [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Updating instance_info_cache with network_info: [{"id": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "address": "fa:16:3e:a3:18:64", "network": {"id": "c0caf4c9-01b4-432e-97ba-114420bd60b6", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-750687695-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b2f7e4ed23c24eaf9e9b0300e9b8b2bf", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "571cdf48-7016-4715-8739-4cb70c90cd6d", "external-id": "nsx-vlan-transportzone-360", "segmentation_id": 360, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap21089f3a-3b", "ovs_interfaceid": "21089f3a-3b08-442c-bea7-cebbbcd759fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 747.398277] env[59379]: DEBUG oslo_concurrency.lockutils [req-90940e0c-edbe-415b-b4d2-5d3a3f024ed5 req-6ee5b450-120d-4c76-b30a-2a90675210a5 service nova] Releasing lock "refresh_cache-2342e3da-6d68-466a-9140-ced4eeda73d7" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 747.811476] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 747.811785] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 747.812519] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 747.812690] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 747.812872] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 747.813157] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-13d6cc09-c834-46dc-bfa1-6b5634ed2f0e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 747.825259] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 747.825486] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 747.826291] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bdb343e8-251e-4d3b-b2c2-bdfa2a5f3918 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 747.832110] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Waiting for the task: (returnval){
[ 747.832110] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52d8946c-5f07-20c8-7b98-2024c3c02ba7"
[ 747.832110] env[59379]: _type = "Task"
[ 747.832110] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 747.851480] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52d8946c-5f07-20c8-7b98-2024c3c02ba7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
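The Acquiring/Acquired/Releasing lines above (lockutils.py:312/315/333) and the "by ..." variants (lockutils.py:404/409/423) correspond to oslo.concurrency's two locking forms. A minimal sketch, with lock names copied from this log and placeholder bodies:

    # Sketch of the oslo.concurrency primitives behind the lock DEBUG lines.
    from oslo_concurrency import lockutils

    # Context-manager form (emits the lockutils.py:312/315/333 lines):
    with lockutils.lock('[datastore1] devstack-image-cache_base/'
                        'a816e082-61f0-4ffa-a214-1bf6bd197f53'):
        pass  # only one worker prepares this cached image at a time

    # Decorator form (the "inner" wrapper, lockutils.py:404/409/423):
    @lockutils.synchronized('2342e3da-6d68-466a-9140-ced4eeda73d7-events')
    def _pop_event():
        pass  # critical section; waited/held times are logged around it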
[ 748.343821] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 748.344146] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Creating directory with path [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 748.344183] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0a943fb-547f-49fe-9552-407d1e92b48c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.365201] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Created directory with path [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 748.365421] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Fetch image to [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 748.365586] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 748.366439] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1cf3402-4c08-436e-9fa4-834b141ae3e4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.375010] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f317ebca-7a61-47a7-a022-f41119fc2f4e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.385147] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af9d26dd-2699-4960-8cbc-0a821fa72c3b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.418642] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35416caa-17cf-45a6-8cef-c04acc81c9ee {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.425288] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c57fdc4-756e-49dc-bcdc-a3146eaa532f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 748.448787] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 748.510370] env[59379]: DEBUG oslo_vmware.rw_handles [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 748.568476] env[59379]: DEBUG oslo_vmware.rw_handles [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 748.568645] env[59379]: DEBUG oslo_vmware.rw_handles [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
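The upload URL in the rw_handles lines above follows the ESX datastore-folder HTTP API: the path after /folder/ is the datastore-relative file path, and the dcPath/dsName query parameters select the datacenter and datastore. A sketch of the composition (hypothetical helper; the real construction lives inside oslo.vmware):

    # Hypothetical helper reproducing the datastore-folder URL format above;
    # oslo.vmware builds this internally when writing tmp-sparse.vmdk.
    from urllib.parse import quote, urlencode

    def datastore_folder_url(host, ds_file_path, dc_path, ds_name):
        query = urlencode({'dcPath': dc_path, 'dsName': ds_name})
        return 'https://%s:443/folder/%s?%s' % (host, quote(ds_file_path), query)

    print(datastore_folder_url(
        'esx7c1n3.openstack.eu-de-1.cloud.sap',
        'vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/'
        'a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk',
        'ha-datacenter', 'datastore1'))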
[ 777.435759] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 778.428559] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 778.433229] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 780.429193] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 780.450361] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 780.451740] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 780.451740] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 780.470649] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.470826] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.470974] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471115] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471235] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471350] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471465] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471577] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471688] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471799] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 780.471912] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
[ 780.472369] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 781.433515] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 781.433822] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 781.433901] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 781.434047] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
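The run of "Running periodic task ComputeManager._*" lines above is oslo.service's periodic-task machinery iterating over decorated methods. A minimal sketch of how such tasks are declared and driven (the spacing value is illustrative, not Nova's configuration):

    # Sketch of oslo.service periodic tasks in the ComputeManager style.
    from oslo_config import cfg
    from oslo_service import periodic_task

    class SketchManager(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        @periodic_task.periodic_task(spacing=60)  # illustrative interval
        def _heal_instance_info_cache(self, context):
            pass  # refresh network info for one instance per run

    mgr = SketchManager(cfg.CONF)
    # A timer in the service loop calls this repeatedly, emitting the
    # "Running periodic task ..." DEBUG lines for each due method:
    mgr.run_periodic_tasks(context=None)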
[ 781.434186] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 781.443677] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 781.443888] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 781.444063] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 781.444225] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 781.445304] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e77644d-b1bf-4b70-be36-7b71ada3ed5a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 781.454268] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51e202ee-aa74-401d-ab84-87d70ba504b7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 781.467787] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe33e8b-5308-4c8f-9752-a2810463b092 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 781.473847] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc35343e-7fc4-4702-ae1d-f14ae7274b89 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 781.502011] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181697MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 781.502167] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 781.502351] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 781.570786] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance b9ffb5d9-8d56-4980-9e78-1e003cd56f7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.570965] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.571109] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.571229] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.571345] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 19253198-cb6e-4c48-a88b-26780f3606e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.571459] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.572037] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.572037] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.572037] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 238825ed-3715-444c-be7c-f42f3884df7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.572037] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2342e3da-6d68-466a-9140-ced4eeda73d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 781.582868] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 8264a1ad-cf20-404f-9d30-30c126e0c222 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.609556] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.622497] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a6ff207e-a925-46d1-9aaf-e06268d3c6f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.632519] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 03742e11-0fb2-48e2-9093-77ea7b647bf3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.642983] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.652423] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 5df12084-5dd6-41d1-9743-747f17ce3323 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.661564] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2ed6496a-3e75-4cfd-88da-9e0b731f738a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.671892] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 49d76773-e163-440b-aa99-08c379155149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.687674] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance dac8465a-592f-461c-af5b-49369eed5e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.700295] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 54605814-fdf4-43c7-9316-0d2594cdb5fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.709661] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance f196648e-0e82-4a01-91fc-af1ba61f0490 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.720287] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 66420486-d25e-457d-94cd-6f96fca2df7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.732028] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.741263] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 13aee471-4813-4376-a7bf-70f266d9a399 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 781.741454] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 781.741597] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 782.003042] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0c7a58f-ca16-40a7-9bdd-0b0902a4ddfc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 782.011071] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-531767eb-f7ca-4ade-840d-3198a28daa4e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 782.040944] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c27d832c-973c-4b5a-a455-56ace8aaa4b3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 782.048215] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9946f66c-900c-4508-af8b-8694399cf976 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 782.061997] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 782.072177] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
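The inventory and the final resource view above are mutually consistent: what placement exposes per resource class is (total - reserved) * allocation_ratio, and the "used" figures match the ten active 1 GB / 128 MB / 1 VCPU allocations plus the 512 MB memory reservation. A quick arithmetic check:

    # Arithmetic check of the inventory / final resource view logged above.
    def exposed_capacity(total, reserved, allocation_ratio):
        # what the scheduler can place against, per resource class
        return int((total - reserved) * allocation_ratio)

    assert exposed_capacity(48, 0, 4.0) == 192           # VCPU
    assert exposed_capacity(196590, 512, 1.0) == 196078  # MEMORY_MB
    assert exposed_capacity(400, 0, 1.0) == 400          # DISK_GB

    # used_ram=1792MB == 512MB reserved + 10 instances * 128MB each;
    # used_disk=10GB and used_vcpus=10 match ten 1GB/1VCPU allocations.
    assert 512 + 10 * 128 == 1792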
[ 782.085322] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 782.085537] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 785.579024] env[59379]: DEBUG nova.compute.manager [req-1eb3e22a-8df4-44b3-83a2-269695a799d8 req-1879333d-15b7-4eb0-b589-f6ebcd2f6720 service nova] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Received event network-vif-deleted-3a8f691e-810b-46e6-9adb-0a48e8b6d8f2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 787.679521] env[59379]: WARNING oslo_vmware.rw_handles [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles     response.begin()
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 787.679521] env[59379]: ERROR oslo_vmware.rw_handles
[ 787.680381] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 787.681832] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 787.683593] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Copying Virtual Disk [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/1c0d83cc-80d5-44d2-a8eb-62230c793400/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 787.683934] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-16783ffc-a501-4a85-a905-a4a55a0d393e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 787.695092] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Waiting for the task: (returnval){
[ 787.695092] env[59379]: value = "task-559579"
[ 787.695092] env[59379]: _type = "Task"
[ 787.695092] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 787.706086] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Task: {'id': task-559579, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 788.205241] env[59379]: DEBUG oslo_vmware.exceptions [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 788.206027] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 788.206126] env[59379]: ERROR nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 788.206126] env[59379]: Faults: ['InvalidArgument']
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Traceback (most recent call last):
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     yield resources
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     self.driver.spawn(context, instance, image_meta,
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     self._fetch_image_if_missing(context, vi)
[ 788.206126] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     image_cache(vi, tmp_image_ds_loc)
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     vm_util.copy_virtual_disk(
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     session._wait_for_task(vmdk_copy_task)
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     return self.wait_for_task(task_ref)
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     return evt.wait()
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     result = hub.switch()
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 788.206571] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     return self.greenlet.switch()
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     self.f(*self.args, **self.kw)
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]     raise exceptions.translate_fault(task_info.error)
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Faults: ['InvalidArgument']
[ 788.206983] env[59379]: ERROR nova.compute.manager [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e]
[ 788.206983] env[59379]: INFO nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Terminating instance
[ 788.207946] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 788.208162] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 788.208386] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-708378ea-1eec-4de6-b5f7-cb599f065cc8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.210648] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
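The traceback above shows the failure path: the CopyVirtualDisk_Task completes in error, oslo.vmware's poller translates the TaskInfo fault and raises VimFaultException, and the compute manager aborts the spawn. A minimal sketch of catching that exception type (session and task are placeholders):

    # Sketch of handling the fault raised by wait_for_task() in the
    # traceback above; session/vmdk_copy_task are placeholders.
    from oslo_vmware import exceptions as vexc

    def cache_sparse_image(session, vmdk_copy_task):
        try:
            session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as err:
            # For the failure above, err.fault_list == ['InvalidArgument']
            # and the message names the bad parameter: fileType.
            raise  # caller terminates the instance and aborts its claim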
[ 788.210827] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 788.211574] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8575933-6293-4e86-81e6-6a644ac94778 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.218265] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 788.218458] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-65b20513-c710-41d5-9fde-572296ecf2ad {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.220582] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 788.220746] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 788.221656] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-76094795-5077-405c-b1ea-dc770a2a77f6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.226033] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){
[ 788.226033] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52c73080-bacb-d948-d8ea-9967e6d9b878"
[ 788.226033] env[59379]: _type = "Task"
[ 788.226033] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 788.233046] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52c73080-bacb-d948-d8ea-9967e6d9b878, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 788.392436] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 788.392642] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 788.392815] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Deleting the datastore file [datastore2] b9ffb5d9-8d56-4980-9e78-1e003cd56f7e {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 788.393068] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a413d909-445c-48ee-bd7e-7b3319b362f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.400435] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Waiting for the task: (returnval){
[ 788.400435] env[59379]: value = "task-559581"
[ 788.400435] env[59379]: _type = "Task"
[ 788.400435] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 788.408609] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Task: {'id': task-559581, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
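The Unregister/delete sequence above is the vmwareapi driver's teardown path: UnregisterVM removes the VM from vCenter's inventory, then a FileManager.DeleteDatastoreFile_Task removes the instance folder. A rough sketch of those two calls through oslo.vmware (not Nova's literal code; vm_ref and dc_ref are placeholder managed-object references):

    # Rough sketch of the teardown steps logged above; placeholders only.
    def destroy_backing(session, vm_ref, dc_ref, ds_path):
        # Remove the VM from the vCenter inventory (no task returned).
        session.invoke_api(session.vim, 'UnregisterVM', vm_ref)
        # Delete the instance directory on the datastore, then wait for it,
        # producing the DeleteDatastoreFile_Task poll lines above.
        task = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task',
            session.vim.service_content.fileManager,
            name=ds_path,  # e.g. '[datastore2] b9ffb5d9-8d56-4980-9e78-1e003cd56f7e'
            datacenter=dc_ref)
        session.wait_for_task(task)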
[ 788.501201] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 788.738217] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 788.738217] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating directory with path [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 788.738217] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-95985b70-d677-4b1b-ae33-b090b44e2a2b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.749538] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created directory with path [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 788.749791] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Fetch image to [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 788.749939] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 788.750947] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-220a48b6-904c-4b0a-bfa3-bab12798e6ab {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.757944] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19a8496f-9481-46be-876b-400b9da941e2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.772026] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-040843b4-1534-447c-a221-6b2c43305025 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.799442] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e0f506-f590-4e8c-adaa-79a81c4352aa {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.805640] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4cb08c2c-0893-478a-bb8b-ea08edbc483a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 788.827068] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 788.886189] env[59379]: DEBUG oslo_vmware.rw_handles [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 788.943938] env[59379]: DEBUG oslo_vmware.rw_handles [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 788.944142] env[59379]: DEBUG oslo_vmware.rw_handles [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 788.948681] env[59379]: DEBUG oslo_vmware.api [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Task: {'id': task-559581, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080646} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 788.948932] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 788.949126] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 788.949372] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 788.949579] env[59379]: INFO nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Took 0.74 seconds to destroy the instance on the hypervisor.
[ 788.953363] env[59379]: DEBUG nova.compute.claims [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 788.953363] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 788.953363] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 788.983029] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 788.983570] env[59379]: DEBUG nova.compute.utils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance b9ffb5d9-8d56-4980-9e78-1e003cd56f7e could not be found.
{{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 788.985037] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 788.985229] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 788.985416] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 788.985634] env[59379]: DEBUG nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 788.985728] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 789.113904] env[59379]: DEBUG nova.network.neutron [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 789.123794] env[59379]: INFO nova.compute.manager [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Took 0.14 seconds to deallocate network for instance. 
[ 789.176911] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7adad612-f524-4c15-81ed-45aff7e31009 tempest-ServersV294TestFqdnHostnames-935067380 tempest-ServersV294TestFqdnHostnames-935067380-project-member] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 204.666s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 789.178061] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 195.332s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 789.178237] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] During sync_power_state the instance has a pending task (spawning). Skip. [ 789.178399] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "b9ffb5d9-8d56-4980-9e78-1e003cd56f7e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 789.187155] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 789.246069] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 789.246351] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 789.247793] env[59379]: INFO nova.compute.claims [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 789.561417] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 789.675854] env[59379]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0e2eb1c-0059-4e25-87bd-e86b58f9cb50 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.683726] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7701e9f6-d04e-4874-9f41-e0adbd5814c8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.721471] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbfee58-7e70-42eb-9a88-5e2edd57ab94 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.730166] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b7d9ef-0087-4fc4-805b-da62eae12319 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.743571] env[59379]: DEBUG nova.compute.provider_tree [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 789.753912] env[59379]: DEBUG nova.scheduler.client.report [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 789.776868] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.530s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 789.777398] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Start building networks asynchronously for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 789.818710] env[59379]: DEBUG nova.compute.utils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 789.819968] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 789.820587] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 789.831420] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 789.914286] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 789.932021] env[59379]: DEBUG nova.policy [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6ad48be56706489fb716357da3ba96b1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '188cc585bbcd41899980076b5c302bd1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 789.939703] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 789.939972] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 789.940075] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 789.940196] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 789.940335] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 789.940473] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 789.940666] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 789.940838] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 789.941310] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 789.941524] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 789.941699] env[59379]: DEBUG nova.virt.hardware [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 789.942564] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903de470-4d75-4af6-85c6-b8ed5b3e88e4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 789.953051] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41bf3ca0-4606-416f-ab42-582f8a19c8f4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 790.157280] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 790.451414] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Successfully created port: b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 791.385973] env[59379]: DEBUG nova.compute.manager [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Received event network-vif-plugged-b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10998}} [ 791.386255] env[59379]: DEBUG oslo_concurrency.lockutils [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] Acquiring lock "8264a1ad-cf20-404f-9d30-30c126e0c222-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 791.386377] env[59379]: DEBUG oslo_concurrency.lockutils [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] Lock "8264a1ad-cf20-404f-9d30-30c126e0c222-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 791.386532] env[59379]: DEBUG oslo_concurrency.lockutils [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] Lock "8264a1ad-cf20-404f-9d30-30c126e0c222-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 791.386733] env[59379]: DEBUG nova.compute.manager [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] No waiting events found dispatching network-vif-plugged-b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 791.386825] env[59379]: WARNING nova.compute.manager [req-3ae1cb5f-500c-4a35-8aba-222fd720249b req-488b05a0-b87d-4a2b-b111-3065f8d85635 service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Received unexpected event network-vif-plugged-b6c6cebf-c8a8-493f-b806-711c6652a302 for instance with vm_state building and task_state spawning. 
[ 791.433781] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Successfully updated port: b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 791.442872] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 791.443112] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquired lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 791.443190] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 791.466077] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "19253198-cb6e-4c48-a88b-26780f3606e8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 791.504897] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 791.999306] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Updating instance_info_cache with network_info: [{"id": "b6c6cebf-c8a8-493f-b806-711c6652a302", "address": "fa:16:3e:1a:98:03", "network": {"id": "342c03ef-678f-444b-b97c-19bfadb7cb24", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-528346499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "188cc585bbcd41899980076b5c302bd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "31e77685-b4dd-4810-80ef-24115ea9ea62", "external-id": "nsx-vlan-transportzone-56", "segmentation_id": 56, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c6cebf-c8", "ovs_interfaceid": "b6c6cebf-c8a8-493f-b806-711c6652a302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 792.010676] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Releasing lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 792.010983] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance network_info: |[{"id": "b6c6cebf-c8a8-493f-b806-711c6652a302", "address": "fa:16:3e:1a:98:03", "network": {"id": "342c03ef-678f-444b-b97c-19bfadb7cb24", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-528346499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "188cc585bbcd41899980076b5c302bd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "31e77685-b4dd-4810-80ef-24115ea9ea62", "external-id": "nsx-vlan-transportzone-56", "segmentation_id": 56, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c6cebf-c8", "ovs_interfaceid": "b6c6cebf-c8a8-493f-b806-711c6652a302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 792.011360] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1a:98:03', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '31e77685-b4dd-4810-80ef-24115ea9ea62', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b6c6cebf-c8a8-493f-b806-711c6652a302', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 792.018822] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Creating folder: Project (188cc585bbcd41899980076b5c302bd1). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 792.019347] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e4c7dda3-9ee3-4eda-a322-ff94119945a1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 792.030569] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Created folder: Project (188cc585bbcd41899980076b5c302bd1) in parent group-v140509. [ 792.030748] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Creating folder: Instances. Parent ref: group-v140560. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 792.030969] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f31a468a-e688-413a-8e39-c17df97b3d84 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 792.039939] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Created folder: Instances in parent group-v140560. [ 792.040187] env[59379]: DEBUG oslo.service.loopingcall [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 792.040360] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 792.040538] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3613fbb7-27a3-4144-8c69-e287d4b8f9e9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 792.071022] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 792.071022] env[59379]: value = "task-559584" [ 792.071022] env[59379]: _type = "Task" [ 792.071022] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 792.078492] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559584, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 792.537027] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "06d5ac6a-7734-46e3-80c5-d960821b7552" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 792.537317] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "06d5ac6a-7734-46e3-80c5-d960821b7552" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 792.582715] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559584, 'name': CreateVM_Task, 'duration_secs': 0.303018} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 792.582715] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 792.583304] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 792.583456] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 792.583776] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 792.584167] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-92b1dbe8-d0fc-4a9d-8090-a40f7895fb4d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 792.588530] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Waiting for the task: (returnval){ [ 792.588530] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]523a4660-d795-1d1d-c85e-ecc68dbc4e9f" [ 792.588530] env[59379]: _type = "Task" [ 792.588530] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 792.595806] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]523a4660-d795-1d1d-c85e-ecc68dbc4e9f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 792.812757] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 793.099230] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 793.099462] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 793.099664] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 793.414386] env[59379]: DEBUG nova.compute.manager [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Received event network-changed-b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 793.414590] env[59379]: DEBUG nova.compute.manager [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Refreshing instance network info cache due to event network-changed-b6c6cebf-c8a8-493f-b806-711c6652a302. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 793.415080] env[59379]: DEBUG oslo_concurrency.lockutils [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] Acquiring lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 793.415331] env[59379]: DEBUG oslo_concurrency.lockutils [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] Acquired lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 793.415512] env[59379]: DEBUG nova.network.neutron [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Refreshing network info cache for port b6c6cebf-c8a8-493f-b806-711c6652a302 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 793.760584] env[59379]: DEBUG nova.network.neutron [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Updated VIF entry in instance network info cache for port b6c6cebf-c8a8-493f-b806-711c6652a302. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 793.760584] env[59379]: DEBUG nova.network.neutron [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Updating instance_info_cache with network_info: [{"id": "b6c6cebf-c8a8-493f-b806-711c6652a302", "address": "fa:16:3e:1a:98:03", "network": {"id": "342c03ef-678f-444b-b97c-19bfadb7cb24", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-528346499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "188cc585bbcd41899980076b5c302bd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "31e77685-b4dd-4810-80ef-24115ea9ea62", "external-id": "nsx-vlan-transportzone-56", "segmentation_id": 56, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c6cebf-c8", "ovs_interfaceid": "b6c6cebf-c8a8-493f-b806-711c6652a302", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 793.769359] env[59379]: DEBUG oslo_concurrency.lockutils [req-5cf7b9ea-bb35-4b56-99f6-04211a3a06a1 req-e139e0be-2770-4b3c-afda-805da3fe1c3e service nova] Releasing lock "refresh_cache-8264a1ad-cf20-404f-9d30-30c126e0c222" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 796.336070] env[59379]: WARNING oslo_vmware.rw_handles [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 796.336070] env[59379]: ERROR oslo_vmware.rw_handles [ 796.336070] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 796.336763] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 796.336763] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Copying Virtual Disk [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/18dedd0a-5964-4f66-b30e-e3f2660f3fdb/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 796.336763] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a14499f4-878f-462a-8547-52f09ff97f11 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.345411] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Waiting for the task: (returnval){ [ 796.345411] env[59379]: value = "task-559585" [ 796.345411] env[59379]: _type = "Task" [ 796.345411] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.354715] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Task: {'id': task-559585, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 796.856325] env[59379]: DEBUG oslo_vmware.exceptions [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 796.856551] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 796.857094] env[59379]: ERROR nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.857094] env[59379]: Faults: ['InvalidArgument'] [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Traceback (most recent call last): [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] yield resources [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self.driver.spawn(context, instance, image_meta, [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self._fetch_image_if_missing(context, vi) [ 796.857094] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] image_cache(vi, tmp_image_ds_loc) [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] vm_util.copy_virtual_disk( [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] session._wait_for_task(vmdk_copy_task) [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return self.wait_for_task(task_ref) [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return evt.wait() [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] result = hub.switch() [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 796.857412] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return self.greenlet.switch() [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self.f(*self.args, **self.kw) [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] raise exceptions.translate_fault(task_info.error) [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Faults: ['InvalidArgument'] [ 796.857731] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] [ 796.857731] env[59379]: INFO nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Terminating instance [ 796.858921] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 796.859144] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 796.859387] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2a7dff10-b209-45b3-b730-b8a997958806 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.861678] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 796.861860] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 796.862636] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43aae962-f1a3-4cb4-88b4-842808bfdd8e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.869524] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 796.869741] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e9ed0e41-318d-4952-b5df-336518d4e230 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.871857] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 796.872024] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 796.872962] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b26ef5b0-722e-4238-b408-64641aa0dd8a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.877755] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Waiting for the task: (returnval){ [ 796.877755] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52af62f8-22d3-21f5-c594-8a2216314fd8" [ 796.877755] env[59379]: _type = "Task" [ 796.877755] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.884771] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52af62f8-22d3-21f5-c594-8a2216314fd8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 796.950521] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 796.950730] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 796.950906] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Deleting the datastore file [datastore1] 2342e3da-6d68-466a-9140-ced4eeda73d7 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 796.951229] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5e205886-f06c-43d1-9dc0-30ef5843e63e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.957694] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Waiting for the task: (returnval){ [ 796.957694] env[59379]: value = "task-559587" [ 796.957694] env[59379]: _type = "Task" [ 796.957694] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 796.966656] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Task: {'id': task-559587, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 797.389615] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 797.389953] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Creating directory with path [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 797.390114] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fb1abece-fb04-4ab5-aa74-f74838197b44 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.402280] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Created directory with path [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 797.402473] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Fetch image to [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 797.402584] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 797.403371] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23fd3443-291a-463d-846d-05e2cbbcf97d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.410324] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f149cda7-d4c0-47c8-b359-2932593b960e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.419502] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb1062db-9b44-43a2-96a1-2f08966773b8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.449328] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b516347c-4ffc-4ac4-9da6-2ed66932a713 
{{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.454720] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ecfe06c5-e166-4e11-8e96-02fa8b0893dd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.465402] env[59379]: DEBUG oslo_vmware.api [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Task: {'id': task-559587, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068151} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 797.465619] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 797.465787] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 797.465955] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 797.466136] env[59379]: INFO nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Took 0.60 seconds to destroy the instance on the hypervisor. 
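[editor's note] The teardown above follows the task-wait pattern that runs through this whole log: each vCenter operation (CopyVirtualDisk, DeleteDatastoreFile_Task, and later CreateVM_Task and ReconfigVM_Task) returns a task reference that oslo_vmware polls until it reaches success or error, re-raising server-side faults locally. That is exactly where the "A specified parameter was not correct: fileType" VimFaultException in the traceback above surfaced. Below is a minimal, self-contained sketch of that polling loop; the wait_for_task function, the VimFaultException class, and the dict-shaped task info here are simplified stand-ins for oslo_vmware's actual implementation, not the real API.

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(poll_task_info, interval=0.5):
        """Poll task info until the task succeeds or raises its server-side fault.

        poll_task_info: callable returning a dict like
            {'state': 'running'|'success'|'error',
             'progress': int, 'faults': list, 'error': str or None}
        """
        while True:
            info = poll_task_info()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # The server reports the fault in the task info and the client
                # re-raises it locally; this is the step the log shows as
                # "raise exceptions.translate_fault(task_info.error)".
                raise VimFaultException(info.get('faults', []), info['error'])
            time.sleep(interval)

    # Example: a task that reports the fault seen in the log on its first poll.
    def failing_poll():
        return {'state': 'error', 'progress': 0,
                'faults': ['InvalidArgument'],
                'error': 'A specified parameter was not correct: fileType'}

    try:
        wait_for_task(failing_poll)
    except VimFaultException as exc:
        print(exc.fault_list, '-', exc)  # ['InvalidArgument'] - A specified parameter ...

In the real client the loop body runs on an eventlet-based looping call rather than time.sleep, which is why the traceback above passes through hub.switch() and oslo_vmware/common/loopingcall.py before reaching _poll_task.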
[ 797.468220] env[59379]: DEBUG nova.compute.claims [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 797.468379] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 797.468577] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 797.475490] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 797.522736] env[59379]: DEBUG oslo_vmware.rw_handles [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 797.577429] env[59379]: DEBUG oslo_vmware.rw_handles [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 797.577598] env[59379]: DEBUG oslo_vmware.rw_handles [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 797.831017] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-373210d0-5128-4bf6-9081-7f05b44cec02 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.838149] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90ec053b-9cd7-428a-b689-1d875bc102b1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.872902] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85335f89-3ed2-49da-9ece-dd0b9131eebd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.880391] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8c53e32-c77c-4472-bdf5-98d2368cc409 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 797.893598] env[59379]: DEBUG nova.compute.provider_tree [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 797.905180] env[59379]: DEBUG nova.scheduler.client.report [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 797.918912] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.450s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 797.919650] env[59379]: ERROR nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.919650] env[59379]: Faults: ['InvalidArgument'] [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Traceback (most recent call last): [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] 
self.driver.spawn(context, instance, image_meta, [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self._fetch_image_if_missing(context, vi) [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] image_cache(vi, tmp_image_ds_loc) [ 797.919650] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] vm_util.copy_virtual_disk( [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] session._wait_for_task(vmdk_copy_task) [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return self.wait_for_task(task_ref) [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return evt.wait() [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] result = hub.switch() [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] return self.greenlet.switch() [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 797.920125] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] self.f(*self.args, **self.kw) [ 797.920456] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 797.920456] env[59379]: ERROR nova.compute.manager [instance: 
2342e3da-6d68-466a-9140-ced4eeda73d7] raise exceptions.translate_fault(task_info.error) [ 797.920456] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 797.920456] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Faults: ['InvalidArgument'] [ 797.920456] env[59379]: ERROR nova.compute.manager [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] [ 797.920943] env[59379]: DEBUG nova.compute.utils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 797.922399] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Build of instance 2342e3da-6d68-466a-9140-ced4eeda73d7 was re-scheduled: A specified parameter was not correct: fileType [ 797.922399] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 797.923035] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 797.923035] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 797.923184] env[59379]: DEBUG nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 797.923241] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 798.399327] env[59379]: DEBUG nova.network.neutron [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 798.410902] env[59379]: INFO nova.compute.manager [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] [instance: 2342e3da-6d68-466a-9140-ced4eeda73d7] Took 0.49 seconds to deallocate network for instance. [ 798.502235] env[59379]: INFO nova.scheduler.client.report [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Deleted allocations for instance 2342e3da-6d68-466a-9140-ced4eeda73d7 [ 798.533823] env[59379]: DEBUG oslo_concurrency.lockutils [None req-14cc2716-f2a3-4bc8-91ae-56740912fc57 tempest-SecurityGroupsTestJSON-1449800206 tempest-SecurityGroupsTestJSON-1449800206-project-member] Lock "2342e3da-6d68-466a-9140-ced4eeda73d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 130.965s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.546359] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Starting instance...
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 798.597042] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.597308] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 798.599115] env[59379]: INFO nova.compute.claims [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 798.894944] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3395ab70-d423-4c36-a437-2aa8c7c95e9d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.902905] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4192d1c0-e060-4059-ad71-91b1c0165dcc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.933239] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2b9306-8474-44e8-b525-004b0150526d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.939989] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21cc5728-bc4f-415d-81ef-2fbebe59259a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.952543] env[59379]: DEBUG nova.compute.provider_tree [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.960827] env[59379]: DEBUG nova.scheduler.client.report [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.974328] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e 
tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.974661] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 799.006026] env[59379]: DEBUG nova.compute.utils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 799.008355] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 799.008533] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 799.017713] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 799.048774] env[59379]: INFO nova.virt.block_device [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Booting with volume 26f0bfa0-6fa3-442a-8115-af5c9e8839d2 at /dev/sda [ 799.087743] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b406a244-9e47-4410-966a-166be504cc2a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.098451] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2315b54-cc06-4ffa-924a-b91c87593fd4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.125228] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-546efa8d-2c4b-45f7-ade6-c772d7f9787a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.128452] env[59379]: DEBUG nova.policy [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5df5c462491a49c3a5aedc630cd0bfac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d2413e255144034ba23edb5eac6962a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 799.135475] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dde31317-c03b-46d9-a92b-14e357102e35 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.162118] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5913feb7-910f-4fb0-a41d-24c416a0bf04 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.168211] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c358f6b3-a61b-440d-a14e-d95e38582787 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.182934] env[59379]: DEBUG nova.virt.block_device [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updating existing volume attachment record: 16819a76-470e-4eaa-9cb4-5f5595eb70ed {{(pid=59379) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 799.257914] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [
799.403181] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 799.403181] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 799.403181] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 799.403587] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 799.403587] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 799.403969] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 799.404362] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 799.404837] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 799.405545] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Build topologies for 1 
vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 799.405545] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 799.405866] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 799.406518] env[59379]: DEBUG nova.virt.hardware [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 799.407979] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2053178d-dfdb-4732-82a9-bca6452f297c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.417424] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d666b02-cfaf-45da-aad2-4346a09a1e1b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 799.586150] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Successfully created port: 8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 800.831194] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Successfully updated port: 8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 800.840228] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.840292] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquired lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.840408] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Building network info cache for instance {{(pid=59379) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2009}} [ 800.917236] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 800.933691] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "71554abb-780c-4681-909f-8ff93712c82e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 801.068627] env[59379]: DEBUG nova.compute.manager [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Received event network-vif-plugged-8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 801.068993] env[59379]: DEBUG oslo_concurrency.lockutils [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] Acquiring lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 801.069307] env[59379]: DEBUG oslo_concurrency.lockutils [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 801.069513] env[59379]: DEBUG oslo_concurrency.lockutils [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 801.069730] env[59379]: DEBUG nova.compute.manager [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] No waiting events found dispatching network-vif-plugged-8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 801.070063] env[59379]: WARNING nova.compute.manager [req-06962149-b603-4b9b-a6d2-d3f1742c1a81 req-2a2f098d-0952-4023-a3fa-9f19f5c57c76 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Received unexpected event network-vif-plugged-8c8ab20d-0b5a-4110-9968-2324f4f614b2 for instance with vm_state building and task_state spawning.
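[editor's note] The network-vif-plugged records above show nova's external-event handshake: Neutron reports that the port came up, and the compute manager takes the per-instance "-events" lock to pop a registered waiter for that event. Here the instance was still spawning and no waiter had been prepared, so the event is logged as unexpected and dropped, which is benign. The following is a minimal illustrative sketch of that pop-or-warn dispatch; the class layout and the method names prepare and pop_and_signal are invented for the example and use plain threading primitives, not nova's actual eventlet-based InstanceEvents API.

    import threading

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

        def prepare(self, instance_uuid, event_name):
            """Register interest in an event before triggering the action."""
            with self._lock:
                waiter = threading.Event()
                self._waiters[(instance_uuid, event_name)] = waiter
                return waiter

        def pop_and_signal(self, instance_uuid, event_name):
            """Wake the registered waiter, or warn if nobody was waiting."""
            with self._lock:
                waiter = self._waiters.pop((instance_uuid, event_name), None)
            if waiter is None:
                print(f"WARNING: received unexpected event {event_name} "
                      f"for instance {instance_uuid}")
                return False
            waiter.set()
            return True

    events = InstanceEvents()
    # No waiter was prepared, so this prints the warning, as in the log above:
    events.pop_and_signal("cf939f8d-66e3-4146-8566-2c8d06d6d6da",
                          "network-vif-plugged-8c8ab20d-0b5a-4110-9968-2324f4f614b2")

Registering the waiter before starting the action that produces the event is what makes the real handshake race-free; the warning is simply the case where the event arrived before anything asked to wait on it.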
[ 801.189641] env[59379]: DEBUG nova.network.neutron [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updating instance_info_cache with network_info: [{"id": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "address": "fa:16:3e:72:dd:53", "network": {"id": "535fed8c-42ea-48bd-8c50-4573718297a5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1683198461-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d2413e255144034ba23edb5eac6962a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "329d0e4b-4190-484a-8560-9356dc31beca", "external-id": "nsx-vlan-transportzone-29", "segmentation_id": 29, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c8ab20d-0b", "ovs_interfaceid": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.199613] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Releasing lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.199878] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance network_info: |[{"id": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "address": "fa:16:3e:72:dd:53", "network": {"id": "535fed8c-42ea-48bd-8c50-4573718297a5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1683198461-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d2413e255144034ba23edb5eac6962a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "329d0e4b-4190-484a-8560-9356dc31beca", "external-id": "nsx-vlan-transportzone-29", "segmentation_id": 29, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c8ab20d-0b", "ovs_interfaceid": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 801.200436] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None 
req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:dd:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '329d0e4b-4190-484a-8560-9356dc31beca', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8c8ab20d-0b5a-4110-9968-2324f4f614b2', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 801.208411] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Creating folder: Project (6d2413e255144034ba23edb5eac6962a). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 801.208411] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1e65188-e5bf-49c1-a1ad-648c26fd495d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.221466] env[59379]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 801.221632] env[59379]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59379) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 801.221922] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Folder already exists: Project (6d2413e255144034ba23edb5eac6962a). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 801.222117] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Creating folder: Instances. Parent ref: group-v140543. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 801.222361] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f102d87-ce48-4639-a150-be4e5f57579d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.231914] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Created folder: Instances in parent group-v140543. [ 801.232411] env[59379]: DEBUG oslo.service.loopingcall [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 801.232411] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 801.232576] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-df9b8ce6-9ffb-4003-9a6f-2fa34f9b3b67 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.250679] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 801.250679] env[59379]: value = "task-559590" [ 801.250679] env[59379]: _type = "Task" [ 801.250679] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.258051] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559590, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 801.761348] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559590, 'name': CreateVM_Task, 'duration_secs': 0.294201} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 801.761348] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 801.761830] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140546', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'name': 'volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'cf939f8d-66e3-4146-8566-2c8d06d6d6da', 'attached_at': '', 'detached_at': '', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'serial': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2'}, 'delete_on_termination': True, 'guest_format': None, 'attachment_id': '16819a76-470e-4eaa-9cb4-5f5595eb70ed', 'mount_device': '/dev/sda', 'device_type': None, 'boot_index': 0, 'disk_bus': None, 'volume_type': None}], 'swap': None} {{(pid=59379) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 801.762064] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Root volume attach. 
Driver type: vmdk {{(pid=59379) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 801.762859] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71c19c99-7e9e-411f-898f-eb6d8092cd75 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.770648] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9da93d53-6cc0-41ff-a739-5389bf1fc890 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.776422] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-164019e0-e607-4196-99d9-7a858f8c4a81 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.782286] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-014055d6-1d2b-45a3-b638-88186b359f1d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.789554] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 801.789554] env[59379]: value = "task-559591" [ 801.789554] env[59379]: _type = "Task" [ 801.789554] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.796625] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559591, 'name': RelocateVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.305717] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559591, 'name': RelocateVM_Task, 'duration_secs': 0.026656} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 802.306185] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Volume attach. 
Driver type: vmdk {{(pid=59379) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 802.306507] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140546', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'name': 'volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'cf939f8d-66e3-4146-8566-2c8d06d6d6da', 'attached_at': '', 'detached_at': '', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'serial': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2'} {{(pid=59379) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 802.307641] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ae7b959-f5ba-4d21-9be7-94037c2204af {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.335622] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7af01e27-2302-4d5e-adac-56b2a6650d2f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.370899] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Reconfiguring VM instance instance-00000010 to attach disk [datastore1] volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2/volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2.vmdk or device None with type thin {{(pid=59379) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 802.371238] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b86af963-da02-44e5-9205-8ad18cacfda8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.390745] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 802.390745] env[59379]: value = "task-559592" [ 802.390745] env[59379]: _type = "Task" [ 802.390745] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.400988] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559592, 'name': ReconfigVM_Task} progress is 6%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.901661] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559592, 'name': ReconfigVM_Task, 'duration_secs': 0.29816} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 802.901878] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Reconfigured VM instance instance-00000010 to attach disk [datastore1] volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2/volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2.vmdk or device None with type thin {{(pid=59379) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 802.906730] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-d1d0892d-d8dc-4313-a09f-09cdfbf87385 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.923847] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 802.923847] env[59379]: value = "task-559593" [ 802.923847] env[59379]: _type = "Task" [ 802.923847] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.932528] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559593, 'name': ReconfigVM_Task} progress is 6%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 803.099587] env[59379]: DEBUG nova.compute.manager [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Received event network-changed-8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 803.099587] env[59379]: DEBUG nova.compute.manager [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Refreshing instance network info cache due to event network-changed-8c8ab20d-0b5a-4110-9968-2324f4f614b2. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 803.099587] env[59379]: DEBUG oslo_concurrency.lockutils [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] Acquiring lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 803.099587] env[59379]: DEBUG oslo_concurrency.lockutils [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] Acquired lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 803.099587] env[59379]: DEBUG nova.network.neutron [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Refreshing network info cache for port 8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 803.439630] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559593, 'name': ReconfigVM_Task, 'duration_secs': 0.115405} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 803.439998] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140546', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'name': 'volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'cf939f8d-66e3-4146-8566-2c8d06d6d6da', 'attached_at': '', 'detached_at': '', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'serial': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2'} {{(pid=59379) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 803.442383] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-018383c4-1b80-4947-8459-6e333e9e4695 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.446922] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 803.446922] env[59379]: value = "task-559594" [ 803.446922] env[59379]: _type = "Task" [ 803.446922] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 803.455066] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559594, 'name': Rename_Task} progress is 5%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 803.707326] env[59379]: DEBUG nova.network.neutron [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updated VIF entry in instance network info cache for port 8c8ab20d-0b5a-4110-9968-2324f4f614b2. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 803.707672] env[59379]: DEBUG nova.network.neutron [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updating instance_info_cache with network_info: [{"id": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "address": "fa:16:3e:72:dd:53", "network": {"id": "535fed8c-42ea-48bd-8c50-4573718297a5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1683198461-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d2413e255144034ba23edb5eac6962a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "329d0e4b-4190-484a-8560-9356dc31beca", "external-id": "nsx-vlan-transportzone-29", "segmentation_id": 29, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c8ab20d-0b", "ovs_interfaceid": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.718853] env[59379]: DEBUG oslo_concurrency.lockutils [req-3eebb33d-82b9-4aa3-bb86-34346cf74d96 req-94ea6e1e-4245-4d5c-be10-a150c53a7696 service nova] Releasing lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 803.962943] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559594, 'name': Rename_Task, 'duration_secs': 0.12482} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 803.962943] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Powering on the VM {{(pid=59379) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 803.962943] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-2e2d8650-383e-4543-b09c-2c57806d3d4e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.970020] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 803.970020] env[59379]: value = "task-559595" [ 803.970020] env[59379]: _type = "Task" [ 803.970020] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 803.976839] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559595, 'name': PowerOnVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 804.478497] env[59379]: DEBUG oslo_vmware.api [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559595, 'name': PowerOnVM_Task, 'duration_secs': 0.478131} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 804.478774] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Powered on the VM {{(pid=59379) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 804.478998] env[59379]: INFO nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Took 5.08 seconds to spawn the instance on the hypervisor. [ 804.479294] env[59379]: DEBUG nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Checking state {{(pid=59379) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 804.480156] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3adc6db5-668c-4694-aaac-d5cec405113e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.530110] env[59379]: INFO nova.compute.manager [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Took 5.95 seconds to build instance. 
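The RelocateVM_Task, ReconfigVM_Task, Rename_Task, and PowerOnVM_Task entries above all follow the same oslo.vmware pattern: a vCenter task method is invoked through the API session, and wait_for_task() then polls the task (producing the "progress is N%" lines emitted by _poll_task) until it reaches a terminal state. A minimal sketch of that pattern against a bare oslo.vmware session, for orientation only: the hostname, credentials, poll interval, and the reuse of "vm-140546" as a VirtualMachine moref are illustrative assumptions, not values taken from this deployment.

    # Sketch only: poll a vCenter task with oslo.vmware, mirroring the
    # wait_for_task/_poll_task pattern seen in the log above. Endpoint,
    # credentials, and the moref value are placeholders, not log values.
    from oslo_vmware import api
    from oslo_vmware import vim_util

    session = api.VMwareAPISession(
        'vc.example.test', 'administrator@vsphere.local', 'secret',
        api_retry_count=10, task_poll_interval=0.5, port=443)

    # Build a managed object reference for an existing VM (hypothetical ID).
    vm_ref = vim_util.get_moref('vm-140546', 'VirtualMachine')

    # Kick off the asynchronous vCenter task...
    task = session.invoke_api(session.vim, 'PowerOnVM_Task', vm_ref)
    # ...then block while oslo.vmware polls it, logging "progress is N%"
    # entries like those above; raises on failure, returns final task info.
    task_info = session.wait_for_task(task)
    assert task_info.state == 'success'
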
[ 804.542499] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3182a4c8-021b-42d4-9a45-4c6ff298719e tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 133.765s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.551893] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 804.600705] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.600957] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 804.602378] env[59379]: INFO nova.compute.claims [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 804.906368] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9ef6773-b3ce-4c83-9e88-9775f2eddffb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.913772] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b4262d3-6a45-484a-82a9-f47bec4de20a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.945242] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d53d7686-f1fb-4d41-9f2a-43efa3001e84 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.952556] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-daeec5bf-f061-45a6-bd8c-5e727d748a6c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.966179] env[59379]: DEBUG nova.compute.provider_tree [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 804.977046] env[59379]: DEBUG nova.scheduler.client.report [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Inventory has not changed for provider
693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 804.989350] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.989806] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 805.018960] env[59379]: DEBUG nova.compute.utils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 805.020605] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 805.020773] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 805.028412] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 805.088159] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 805.114256] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=<?>,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-31T09:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 805.114482] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 805.114629] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 805.114801] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 805.114960] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 805.115239] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 805.115449] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 805.115653] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 805.115756] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Got 1 possible
topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 805.115921] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 805.116079] env[59379]: DEBUG nova.virt.hardware [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 805.116910] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d460b7f-ceb3-4633-a497-ae464708b2db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.124981] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2073579-9b96-414d-921f-0f0208ab9962 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.140154] env[59379]: DEBUG nova.policy [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab5cca9aff134861993fc7050f446c23', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cba7f3dcabc846a1b0b233e2a84f1a9a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 805.664182] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Successfully created port: b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 806.227117] env[59379]: DEBUG nova.compute.manager [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Received event network-changed-8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 806.227318] env[59379]: DEBUG nova.compute.manager [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Refreshing instance network info cache due to event network-changed-8c8ab20d-0b5a-4110-9968-2324f4f614b2. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 806.227530] env[59379]: DEBUG oslo_concurrency.lockutils [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] Acquiring lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 806.227663] env[59379]: DEBUG oslo_concurrency.lockutils [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] Acquired lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 806.228018] env[59379]: DEBUG nova.network.neutron [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Refreshing network info cache for port 8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 807.088814] env[59379]: DEBUG nova.network.neutron [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updated VIF entry in instance network info cache for port 8c8ab20d-0b5a-4110-9968-2324f4f614b2. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 807.089291] env[59379]: DEBUG nova.network.neutron [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updating instance_info_cache with network_info: [{"id": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "address": "fa:16:3e:72:dd:53", "network": {"id": "535fed8c-42ea-48bd-8c50-4573718297a5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1683198461-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.155", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6d2413e255144034ba23edb5eac6962a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "329d0e4b-4190-484a-8560-9356dc31beca", "external-id": "nsx-vlan-transportzone-29", "segmentation_id": 29, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8c8ab20d-0b", "ovs_interfaceid": "8c8ab20d-0b5a-4110-9968-2324f4f614b2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.103937] env[59379]: DEBUG oslo_concurrency.lockutils [req-1017947e-7b18-4592-8ace-24c5fdf2d0b5 req-6372d0c9-fa61-48b7-999c-b17541cc8b25 service nova] Releasing lock "refresh_cache-cf939f8d-66e3-4146-8566-2c8d06d6d6da" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 807.149271] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: 
a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Successfully updated port: b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 807.160443] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 807.160522] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquired lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 807.160711] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 807.238068] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 807.376635] env[59379]: DEBUG nova.compute.manager [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Received event network-vif-plugged-b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 807.376837] env[59379]: DEBUG oslo_concurrency.lockutils [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] Acquiring lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 807.377043] env[59379]: DEBUG oslo_concurrency.lockutils [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] Lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 807.377236] env[59379]: DEBUG oslo_concurrency.lockutils [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] Lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.377392] env[59379]: DEBUG nova.compute.manager [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] No waiting events found dispatching network-vif-plugged-b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 807.377544] env[59379]: WARNING
nova.compute.manager [req-f4e9a0a4-e7b9-4d38-b777-b3ddd84d25f7 req-0d31f645-3bac-4731-bc06-b03f67139626 service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Received unexpected event network-vif-plugged-b6c227db-1781-458a-8f43-a3a885499260 for instance with vm_state building and task_state spawning. [ 807.528802] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Updating instance_info_cache with network_info: [{"id": "b6c227db-1781-458a-8f43-a3a885499260", "address": "fa:16:3e:e8:c6:28", "network": {"id": "703edf1e-6004-46d2-a45f-af407a936cf2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1523945142-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cba7f3dcabc846a1b0b233e2a84f1a9a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c227db-17", "ovs_interfaceid": "b6c227db-1781-458a-8f43-a3a885499260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 807.542466] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Releasing lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 807.542738] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance network_info: |[{"id": "b6c227db-1781-458a-8f43-a3a885499260", "address": "fa:16:3e:e8:c6:28", "network": {"id": "703edf1e-6004-46d2-a45f-af407a936cf2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1523945142-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cba7f3dcabc846a1b0b233e2a84f1a9a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c227db-17", "ovs_interfaceid": "b6c227db-1781-458a-8f43-a3a885499260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 807.543385] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:c6:28', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f267bcdd-0daa-4337-9709-5fc060c267d8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b6c227db-1781-458a-8f43-a3a885499260', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 807.553054] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Creating folder: Project (cba7f3dcabc846a1b0b233e2a84f1a9a). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 807.553054] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ff04fc66-176d-4d54-8682-1958c6bc2091 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.563842] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Created folder: Project (cba7f3dcabc846a1b0b233e2a84f1a9a) in parent group-v140509. [ 807.564033] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Creating folder: Instances. Parent ref: group-v140565. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 807.564247] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f127f380-c1e9-4cd3-a007-c1ccab68acd8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.575258] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Created folder: Instances in parent group-v140565. [ 807.575478] env[59379]: DEBUG oslo.service.loopingcall [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 807.575648] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 807.575876] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-99d3f8d9-713b-4339-bb2e-62f8d7bf31b9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.596519] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 807.596519] env[59379]: value = "task-559598" [ 807.596519] env[59379]: _type = "Task" [ 807.596519] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 807.604044] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 808.109561] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 808.606610] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 809.107068] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 809.494827] env[59379]: DEBUG nova.compute.manager [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Received event network-changed-b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 809.495113] env[59379]: DEBUG nova.compute.manager [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Refreshing instance network info cache due to event network-changed-b6c227db-1781-458a-8f43-a3a885499260. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 809.495227] env[59379]: DEBUG oslo_concurrency.lockutils [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] Acquiring lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 809.495363] env[59379]: DEBUG oslo_concurrency.lockutils [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] Acquired lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 809.495511] env[59379]: DEBUG nova.network.neutron [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Refreshing network info cache for port b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 809.608217] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 809.787023] env[59379]: DEBUG nova.network.neutron [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Updated VIF entry in instance network info cache for port b6c227db-1781-458a-8f43-a3a885499260. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 809.787023] env[59379]: DEBUG nova.network.neutron [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Updating instance_info_cache with network_info: [{"id": "b6c227db-1781-458a-8f43-a3a885499260", "address": "fa:16:3e:e8:c6:28", "network": {"id": "703edf1e-6004-46d2-a45f-af407a936cf2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1523945142-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cba7f3dcabc846a1b0b233e2a84f1a9a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f267bcdd-0daa-4337-9709-5fc060c267d8", "external-id": "nsx-vlan-transportzone-308", "segmentation_id": 308, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb6c227db-17", "ovs_interfaceid": "b6c227db-1781-458a-8f43-a3a885499260", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.797484] env[59379]: DEBUG oslo_concurrency.lockutils [req-870ec961-e2a5-4865-9c5d-4b6cc5e74610 req-085f2902-1ab4-4aad-b92a-30e86b595a7b service nova] Releasing lock "refresh_cache-a6ff207e-a925-46d1-9aaf-e06268d3c6f2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 810.108203] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task} progress is 99%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 810.610583] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559598, 'name': CreateVM_Task, 'duration_secs': 2.841384} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 810.610872] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 810.612025] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.612025] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.612025] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 810.612359] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0195614-e7a1-4f56-9e13-5e35d3fb2991 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.616801] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Waiting for the task: (returnval){ [ 810.616801] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52b2f7e0-c806-b87b-ed29-383b0d3da96f" [ 810.616801] env[59379]: _type = "Task" [ 810.616801] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 810.625269] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52b2f7e0-c806-b87b-ed29-383b0d3da96f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 811.127379] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 811.127617] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 811.127894] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 822.388964] env[59379]: INFO nova.compute.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Rebuilding instance [ 822.421525] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'trusted_certs' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 822.433069] env[59379]: DEBUG nova.compute.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Checking state {{(pid=59379) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 822.434078] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81192026-583c-4d7e-8324-51c8a10553a5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.484707] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'pci_requests' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 822.493764] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'pci_devices' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 822.502559] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'resources' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) 
obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 822.509112] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'migration_context' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 822.515912] env[59379]: DEBUG nova.objects.instance [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Trying to apply a migration context that does not seem to be set for this instance {{(pid=59379) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 822.516333] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Powering off the VM {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 822.516594] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-827d10d9-a65e-479f-a91c-be4f278f03c0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 822.523793] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 822.523793] env[59379]: value = "task-559599" [ 822.523793] env[59379]: _type = "Task" [ 822.523793] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 822.531751] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559599, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 823.033997] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559599, 'name': PowerOffVM_Task, 'duration_secs': 0.160053} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 823.034267] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Powered off the VM {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 823.034943] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Powering off the VM {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 823.035197] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-e0c8402b-842d-494e-a54b-1b7f5f3f7a8b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 823.041812] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){ [ 823.041812] env[59379]: value = "task-559600" [ 823.041812] env[59379]: _type = "Task" [ 823.041812] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 823.049375] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559600, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 823.552850] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] VM already powered off {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 823.553289] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Volume detach. 
[ 823.553289] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Volume detach. Driver type: vmdk {{(pid=59379) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}}
[ 823.553289] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140546', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'name': 'volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'cf939f8d-66e3-4146-8566-2c8d06d6d6da', 'attached_at': '', 'detached_at': '', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'serial': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2'} {{(pid=59379) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}}
[ 823.553982] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-466f745b-6673-4e62-b54a-7f278277fa62 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.571653] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58aeed82-e74c-4ca4-adfd-b19d72df0498 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.578147] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3858ceb7-9ab5-4129-8387-1ed64501ca90 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.596218] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d56136fc-60cc-4bb6-aa7c-4c2b72ed0410 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.610476] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] The volume has not been displaced from its original location: [datastore1] volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2/volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2.vmdk. No consolidation needed. {{(pid=59379) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}}
[ 823.615586] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Reconfiguring VM instance instance-00000010 to detach disk 2000 {{(pid=59379) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}}
[ 823.615850] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-7932038d-0734-42e3-bb61-542fad06924c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.633784] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){
[ 823.633784] env[59379]: value = "task-559601"
[ 823.633784] env[59379]: _type = "Task"
[ 823.633784] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 823.641494] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559601, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 824.145044] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559601, 'name': ReconfigVM_Task, 'duration_secs': 0.179207} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 824.145044] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Reconfigured VM instance instance-00000010 to detach disk 2000 {{(pid=59379) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}}
[ 824.149403] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-44832700-8694-45e1-ab05-fe0e4f346391 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 824.164444] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){
[ 824.164444] env[59379]: value = "task-559602"
[ 824.164444] env[59379]: _type = "Task"
[ 824.164444] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 824.172227] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559602, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 824.674491] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559602, 'name': ReconfigVM_Task, 'duration_secs': 0.104082} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 824.674926] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140546', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'name': 'volume-26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'cf939f8d-66e3-4146-8566-2c8d06d6d6da', 'attached_at': '', 'detached_at': '', 'volume_id': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2', 'serial': '26f0bfa0-6fa3-442a-8115-af5c9e8839d2'} {{(pid=59379) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}}
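The "Reconfiguring VM instance ... to detach disk 2000" step works by reconfiguring the VM with a device-remove entry and waiting on the resulting ReconfigVM_Task. A condensed sketch of that pattern, assuming the session/vm_ref from before and a resolved VirtualDisk device; the real Nova code also handles destroy-on-detach and spec details omitted here:

    # Build a VirtualMachineConfigSpec whose deviceChange removes the disk,
    # then submit ReconfigVM_Task (the two SOAP calls visible above).
    client_factory = session.vim.client.factory

    config_spec = client_factory.create('ns0:VirtualMachineConfigSpec')
    device_change = client_factory.create('ns0:VirtualDeviceConfigSpec')
    device_change.operation = 'remove'
    device_change.device = device  # the VirtualDisk to drop, e.g. key 2000
    config_spec.deviceChange = [device_change]

    reconfig_task = session.invoke_api(
        session.vim, 'ReconfigVM_Task', vm_ref, spec=config_spec)
    session.wait_for_task(reconfig_task)
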
[ 824.675082] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 824.675754] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5703289e-df3a-49d0-9726-9bc9e003a1e6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 824.682723] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 824.682920] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c41daf0c-b76c-427c-ae0a-5a3768c160fc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 824.754103] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 824.754324] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 824.754497] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Deleting the datastore file [datastore1] cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 824.754734] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3efe68e6-2e0d-4c82-a041-5f2c613874d9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 824.761732] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for the task: (returnval){
[ 824.761732] env[59379]: value = "task-559604"
[ 824.761732] env[59379]: _type = "Task"
[ 824.761732] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 824.769123] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559604, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 825.273273] env[59379]: DEBUG oslo_vmware.api [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Task: {'id': task-559604, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074632} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 825.273662] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 825.273952] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 825.274253] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
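Destroying the instance is the two-step flow visible above: unregister the VM from vCenter, then delete its directory on the datastore with FileManager.DeleteDatastoreFile_Task. A simplified sketch under the same session assumptions (datacenter_ref stands in for the resolved datacenter moref):

    # Unregister the VM (not a task), then delete its datastore directory
    # and wait on the returned task, as in the log above.
    session.invoke_api(session.vim, 'UnregisterVM', vm_ref)

    file_manager = session.vim.service_content.fileManager
    delete_task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] cf939f8d-66e3-4146-8566-2c8d06d6d6da',
        datacenter=datacenter_ref)
    session.wait_for_task(delete_task)
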
[ 825.325871] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Volume detach. Driver type: vmdk {{(pid=59379) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}}
[ 825.326212] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-4081a545-3410-466d-aa6f-4eecdf703580 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.335082] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1daa28c2-b6de-44f7-86dc-8800ef1534dd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.363357] env[59379]: ERROR nova.compute.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Failed to detach volume 26f0bfa0-6fa3-442a-8115-af5c9e8839d2 from /dev/sda: nova.exception.InstanceNotFound: Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da could not be found.
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Traceback (most recent call last):
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self.driver.rebuild(**kwargs)
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise NotImplementedError()
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] NotImplementedError
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] During handling of the above exception, another exception occurred:
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Traceback (most recent call last):
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume
[ 825.363357] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self.driver.detach_volume(context, old_connection_info,
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] return self._volumeops.detach_volume(connection_info, instance)
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._detach_volume_vmdk(connection_info, instance)
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] vm_ref = vm_util.get_vm_ref(self._session, instance)
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] stable_ref.fetch_moref(session)
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise exception.InstanceNotFound(instance_id=self._uuid)
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] nova.exception.InstanceNotFound: Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da could not be found.
[ 825.363727] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.491031] env[59379]: DEBUG nova.compute.utils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Build of instance cf939f8d-66e3-4146-8566-2c8d06d6d6da aborted: Failed to rebuild volume backed instance. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 825.493550] env[59379]: ERROR nova.compute.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance cf939f8d-66e3-4146-8566-2c8d06d6d6da aborted: Failed to rebuild volume backed instance.
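The failure chain here: the VMware driver does not implement rebuild (NotImplementedError), so the compute manager falls back to the default rebuild path, which for a volume-backed instance first detaches the root volume; but the backing VM was already unregistered and deleted above, so the UUID lookup raises InstanceNotFound. A condensed sketch of the lookup that fails, with the helper name chosen for illustration (the real vm_util caches the moref and retries the search):

    from nova import exception

    def fetch_moref_by_uuid(session, uuid):
        # FindAllByUuid asks vCenter for VMs whose instanceUuid matches;
        # an empty result means the VM is gone, hence InstanceNotFound.
        search_index = session.vim.service_content.searchIndex
        vms = session.invoke_api(
            session.vim, 'FindAllByUuid', search_index,
            uuid=uuid, vmSearch=True, instanceUuid=True)
        if not vms:
            raise exception.InstanceNotFound(instance_id=uuid)
        return vms[0]
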
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Traceback (most recent call last):
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self.driver.rebuild(**kwargs)
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise NotImplementedError()
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] NotImplementedError
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] During handling of the above exception, another exception occurred:
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Traceback (most recent call last):
[ 825.493550] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._detach_root_volume(context, instance, root_bdm)
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] with excutils.save_and_reraise_exception():
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self.force_reraise()
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise self.value
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self.driver.detach_volume(context, old_connection_info,
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] return self._volumeops.detach_volume(connection_info, instance)
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume
[ 825.493911] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._detach_volume_vmdk(connection_info, instance)
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] vm_ref = vm_util.get_vm_ref(self._session, instance)
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] stable_ref.fetch_moref(session)
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise exception.InstanceNotFound(instance_id=self._uuid)
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] nova.exception.InstanceNotFound: Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da could not be found.
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] During handling of the above exception, another exception occurred:
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Traceback (most recent call last):
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 10732, in _error_out_instance_on_exception
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] yield
[ 825.494249] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._do_rebuild_instance_with_claim(
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._do_rebuild_instance(
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._rebuild_default_impl(**kwargs)
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] self._rebuild_volume_backed_instance(
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] raise exception.BuildAbortException(
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] nova.exception.BuildAbortException: Build of instance cf939f8d-66e3-4146-8566-2c8d06d6d6da aborted: Failed to rebuild volume backed instance.
[ 825.494592] env[59379]: ERROR nova.compute.manager [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da]
[ 825.604176] env[59379]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 825.604456] env[59379]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 825.842862] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da21e31d-ae30-456b-bc60-8cec2671f472 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.850214] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3af96a-0711-4705-b36d-fac213f27a19 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.879371] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40bee63a-9af8-48b9-bf87-ef888030e878 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.886365] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f166a07-8ecd-4b82-8898-f98ec24b17dc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 825.899475] env[59379]: DEBUG nova.compute.provider_tree [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 825.908309] env[59379]: DEBUG nova.scheduler.client.report [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 825.926887] env[59379]: DEBUG oslo_concurrency.lockutils [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.322s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 825.927097] env[59379]: INFO nova.compute.manager [None req-bcef057d-95fb-4514-b510-ebaf88c76bd7 tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Successfully reverted task state from rebuilding on failure for instance.
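The doubled "During handling of the above exception" chain in the traceback comes from oslo.utils' save_and_reraise_exception context manager, which the compute manager uses so cleanup can run while the original error still propagates (force_reraise re-raises the saved exception on exit). A small sketch of the pattern, with the surrounding function simplified for illustration:

    from oslo_utils import excutils

    def detach_root_volume(driver, context, connection_info, instance):
        try:
            driver.detach_volume(
                context, connection_info, instance, '/dev/sda')
        except Exception:
            with excutils.save_and_reraise_exception():
                # Cleanup/logging goes here; when the block exits,
                # force_reraise() re-raises the original exception,
                # producing the chained traceback seen above.
                pass
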
[ 826.090900] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 826.091277] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 826.091516] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 826.091702] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 826.091863] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 826.093681] env[59379]: INFO nova.compute.manager [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Terminating instance
[ 826.095586] env[59379]: DEBUG nova.compute.manager [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 826.096062] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70ebbb2a-6c8f-4155-9e67-3518287e9a48 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 826.106776] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc1efa4f-7b91-4363-8851-4d826d0e7c08 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 826.134250] env[59379]: WARNING nova.virt.vmwareapi.driver [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance does not exist. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da could not be found.
[ 826.134434] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 826.134715] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-66da26f8-6f5c-4e0a-8ccb-e7b5865a316f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 826.141941] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26004772-36d1-4080-8553-4cf42c1464f2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 826.169226] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cf939f8d-66e3-4146-8566-2c8d06d6d6da could not be found.
[ 826.169412] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 826.169581] env[59379]: INFO nova.compute.manager [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Took 0.07 seconds to destroy the instance on the hypervisor.
[ 826.169807] env[59379]: DEBUG oslo.service.loopingcall [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 826.170054] env[59379]: DEBUG nova.compute.manager [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 826.170152] env[59379]: DEBUG nova.network.neutron [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 827.215900] env[59379]: DEBUG nova.network.neutron [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 827.226682] env[59379]: INFO nova.compute.manager [-] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Took 1.06 seconds to deallocate network for instance.
[ 827.299734] env[59379]: INFO nova.compute.manager [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Took 0.07 seconds to detach 1 volumes for instance.
[ 827.302038] env[59379]: DEBUG nova.compute.manager [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Deleting volume: 26f0bfa0-6fa3-442a-8115-af5c9e8839d2 {{(pid=59379) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}}
[ 827.379902] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 827.380181] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 827.386023] env[59379]: DEBUG nova.objects.instance [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lazy-loading 'resources' on Instance uuid cf939f8d-66e3-4146-8566-2c8d06d6d6da {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}}
[ 827.479653] env[59379]: DEBUG nova.compute.manager [req-c0b761c5-277e-4143-95f8-b8cb89b09b9a req-4fc5979d-7dbf-4de6-b996-8fd5c4dc6a71 service nova] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Received event network-vif-deleted-8c8ab20d-0b5a-4110-9968-2324f4f614b2 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
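The per-instance serialization above ("Acquiring lock "cf939f8d-..." by ... do_terminate_instance", plus the separate "<uuid>-events" lock) is oslo.concurrency's lockutils at work: a lock named after the instance UUID serializes operations on that instance without blocking others. A sketch of the pattern, simplified from the compute-manager shape implied by the lock names:

    from oslo_concurrency import lockutils

    def terminate_instance(self, context, instance):
        # Lock name is the instance UUID, so concurrent operations on the
        # same instance queue up; these decorator internals emit the
        # "Acquiring lock ... / Lock ... acquired / released" lines.
        @lockutils.synchronized(instance.uuid)
        def do_terminate_instance():
            # shutdown, deallocate network, delete volumes, update usage
            self._delete_instance(context, instance)
        do_terminate_instance()
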
[ 827.705667] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94026385-6928-45fe-9f63-35efed7c49ca {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.716063] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34afe2bc-a886-4de7-8764-bf6081a4181f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.749630] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2af2035-b04c-4faf-a964-b43ed4c36366 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.757579] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df7e8b9-ea8e-4420-8a26-96629952ff10 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 827.772663] env[59379]: DEBUG nova.compute.provider_tree [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 827.784110] env[59379]: DEBUG nova.scheduler.client.report [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 827.799239] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.419s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 827.855620] env[59379]: DEBUG oslo_concurrency.lockutils [None req-a1eb2a74-de4b-4a76-81b5-4064150a190b tempest-ServerActionsV293TestJSON-1353407144 tempest-ServerActionsV293TestJSON-1353407144-project-member] Lock "cf939f8d-66e3-4146-8566-2c8d06d6d6da" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.764s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 834.625090] env[59379]: WARNING oslo_vmware.rw_handles [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles response.begin()
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 834.625090] env[59379]: ERROR oslo_vmware.rw_handles
[ 834.625710] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 834.627135] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 834.627376] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Copying Virtual Disk [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/e32bab99-eae2-4ada-b9c5-c7efaff0c706/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 834.627644] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d51f1268-247d-47a7-8553-454883989c76 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 834.635688] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){
[ 834.635688] env[59379]: value = "task-559606"
[ 834.635688] env[59379]: _type = "Task"
[ 834.635688] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 834.643487] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559606, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
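The "Caching image" step copies the downloaded sparse VMDK into the image cache with a server-side VirtualDiskManager.CopyVirtualDisk_Task. A condensed sketch of that call under the same session assumptions (paths shortened with "..." for illustration; the real call site also builds an optional copy spec):

    # Server-side VMDK copy: the CopyVirtualDisk_Task invocation and the
    # task polling seen above for task-559606.
    disk_manager = session.vim.service_content.virtualDiskManager
    copy_task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_manager,
        sourceName='[datastore2] vmware_temp/.../tmp-sparse.vmdk',
        sourceDatacenter=dc_ref,
        destName='[datastore2] vmware_temp/.../a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk',
        destDatacenter=dc_ref)
    session.wait_for_task(copy_task)
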
[ 835.146729] env[59379]: DEBUG oslo_vmware.exceptions [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 835.146838] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 835.147324] env[59379]: ERROR nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 835.147324] env[59379]: Faults: ['InvalidArgument']
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Traceback (most recent call last):
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] yield resources
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self.driver.spawn(context, instance, image_meta,
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self._fetch_image_if_missing(context, vi)
[ 835.147324] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] image_cache(vi, tmp_image_ds_loc)
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] vm_util.copy_virtual_disk(
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] session._wait_for_task(vmdk_copy_task)
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return self.wait_for_task(task_ref)
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return evt.wait()
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] result = hub.switch()
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 835.148611] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return self.greenlet.switch()
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self.f(*self.args, **self.kw)
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] raise exceptions.translate_fault(task_info.error)
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Faults: ['InvalidArgument']
[ 835.149040] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0]
[ 835.149040] env[59379]: INFO nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Terminating instance
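The "Fault InvalidArgument not matched." line records oslo.vmware's fault translation: get_fault_class looks the fault name up among its specific exception classes, and when nothing matches, translate_fault falls back to the generic VimFaultException carrying the fault list, which is what the spawn failure above re-raises. An illustrative sketch of that fallback; the registry contents and function body here are placeholders, not oslo.vmware's actual table:

    from oslo_vmware import exceptions as vexc

    # Placeholder registry mapping fault names to exception classes.
    _REGISTERED = {'FileNotFound': vexc.FileNotFoundException}

    def translate_fault(fault_name, message, fault_list):
        cls = _REGISTERED.get(fault_name)
        if cls is None:
            # "not matched": fall back to the generic fault exception.
            return vexc.VimFaultException(fault_list, message)
        return cls(message)
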
[ 835.149412] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 835.149612] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 835.150282] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 835.150464] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 835.150677] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b66fe5c-4ca2-4aab-a163-beb1de99aa1b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.153238] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdb849fa-22d0-44c8-8bbf-e41fb45675b9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.160667] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 835.161715] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4b4227ff-68d8-45ca-990d-b7d5525db373 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.163075] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 835.163242] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 835.163885] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-94c951aa-5c58-4449-92aa-de4c47a34438 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.170845] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Waiting for the task: (returnval){
[ 835.170845] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]528b5be9-ce4f-4f6d-212b-4c21f1dd89e6"
[ 835.170845] env[59379]: _type = "Task"
[ 835.170845] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 835.177481] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]528b5be9-ce4f-4f6d-212b-4c21f1dd89e6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 835.232765] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 835.233018] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 835.233247] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleting the datastore file [datastore2] 294a5f91-9db2-4a43-8230-d3e6906c30f0 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 835.233535] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9ea356ea-778c-4c8b-89f2-22ecccc9d363 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 835.240396] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){
[ 835.240396] env[59379]: value = "task-559608"
[ 835.240396] env[59379]: _type = "Task"
[ 835.240396] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 835.247941] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559608, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 835.681207] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 835.681559] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Creating directory with path [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 835.681733] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2331a95f-568e-4933-99ae-8e7e4a6e8ad1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.692725] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Created directory with path [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 835.692907] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Fetch image to [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 835.693104] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 835.693819] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54e12342-f289-43b3-a3a2-9fe2342f6a15 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.700353] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35881273-a150-496f-b5ef-7080593be4c1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.709403] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-232132de-d0ab-4031-8bd4-ea976a369840 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.740588] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b19b5e2-461e-42df-8944-934a633c15ad {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.751934] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5312cbe9-fab7-45cd-9d0e-0e1b313adb35 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.753723] env[59379]: DEBUG oslo_vmware.api [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559608, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079075} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 835.753997] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 835.754190] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 835.754353] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 835.754518] env[59379]: INFO nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Took 0.60 seconds to destroy the instance on the hypervisor. 
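
The DeleteDatastoreFile_Task exchange above is oslo.vmware's generic task-polling pattern: the service call returns a Task reference immediately, and wait_for_task then polls its TaskInfo until vCenter reports a terminal state, logging a "progress is N%" entry on each pass. A minimal sketch of that loop, assuming a get_task_info() callable that wraps the property read (the names here are illustrative, not the actual oslo.vmware internals):

    import time

    def wait_for_task(get_task_info, task_ref, interval=0.5):
        """Poll a vCenter task until it reaches a terminal state."""
        while True:
            # assumed helper returning a dict with 'state'/'error'/'progress'
            info = get_task_info(task_ref)
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # oslo.vmware raises a translated VIM fault here; a plain
                # RuntimeError stands in for that translation in this sketch.
                raise RuntimeError('task %s failed: %s'
                                   % (task_ref, info.get('error')))
            # 'queued' / 'running': report progress and poll again, like the
            # "progress is 0%" entries above.
            print("Task: %s progress is %s%%."
                  % (task_ref, info.get('progress', 0)))
            time.sleep(interval)

With a stub such as get_task_info = lambda ref: {'state': 'success'} the loop returns on the first pass; in the real session the wait is additionally funneled through an eventlet Event, which is why task failures surface through evt.wait() and hub.switch() frames, as in the traceback that follows.
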
[ 835.759832] env[59379]: DEBUG nova.compute.claims [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 835.759995] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 835.760214] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 835.779267] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 835.827076] env[59379]: DEBUG oslo_vmware.rw_handles [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 835.885066] env[59379]: DEBUG oslo_vmware.rw_handles [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 835.885066] env[59379]: DEBUG oslo_vmware.rw_handles [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 836.090074] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-766d5c74-78f6-4977-9ed7-7f4890b687e7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 836.098809] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59cee600-aedd-4772-9d6f-eada935d43b1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 836.129102] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ad6979-4d50-4c04-9186-0f22cd5695b9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 836.135519] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baa9dbc6-2f6c-4cff-bf44-82c68bf3e0f6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 836.148106] env[59379]: DEBUG nova.compute.provider_tree [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 836.156344] env[59379]: DEBUG nova.scheduler.client.report [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 836.171852] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.412s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 836.172414] env[59379]: ERROR nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 836.172414] env[59379]: Faults: ['InvalidArgument']
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Traceback (most recent call last):
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self.driver.spawn(context, instance, image_meta,
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self._fetch_image_if_missing(context, vi)
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] image_cache(vi, tmp_image_ds_loc)
[ 836.172414] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] vm_util.copy_virtual_disk(
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] session._wait_for_task(vmdk_copy_task)
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return self.wait_for_task(task_ref)
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return evt.wait()
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] result = hub.switch()
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] return self.greenlet.switch()
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 836.172785] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] self.f(*self.args, **self.kw)
[ 836.173379] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 836.173379] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] raise exceptions.translate_fault(task_info.error)
[ 836.173379] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 836.173379] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Faults: ['InvalidArgument']
[ 836.173379] env[59379]: ERROR nova.compute.manager [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0]
[ 836.173379] env[59379]: DEBUG nova.compute.utils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 836.174427] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Build of instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 was re-scheduled: A specified parameter was not correct: fileType
[ 836.174427] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 836.174784] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 836.174947] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 836.175107] env[59379]: DEBUG nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 836.175261] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 836.566225] env[59379]: DEBUG nova.network.neutron [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 836.575878] env[59379]: INFO nova.compute.manager [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Took 0.40 seconds to deallocate network for instance.
[ 836.659353] env[59379]: INFO nova.scheduler.client.report [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleted allocations for instance 294a5f91-9db2-4a43-8230-d3e6906c30f0
[ 836.674205] env[59379]: DEBUG oslo_concurrency.lockutils [None req-99798546-1953-4093-b712-db4eb0ea5e09 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 249.081s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 836.675310] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 242.829s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 836.675498] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] During sync_power_state the instance has a pending task (spawning). Skip.
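
The Acquiring / acquired / "released" triplets on lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" show oslo.concurrency's named-lock instrumentation: each waiter logs how long it queued for the lock and how long it held it (above, the build lock was held 249.081s; below, do_terminate_instance reports waiting 47.115s behind it). A minimal stand-in for that pattern using a plain threading.Lock, purely illustrative (the real lockutils also supports fair scheduling and external file-based locks):

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}  # one shared Lock object per lock name

    @contextmanager
    def logged_lock(name, caller):
        """Acquire a named lock, logging wait and hold times like lockutils."""
        lock = _locks.setdefault(name, threading.Lock())
        print('Acquiring lock "%s" by "%s"' % (name, caller))
        start = time.monotonic()
        with lock:
            print('Lock "%s" acquired by "%s" :: waited %.3fs'
                  % (name, caller, time.monotonic() - start))
            held_from = time.monotonic()
            try:
                yield
            finally:
                print('Lock "%s" "released" by "%s" :: held %.3fs'
                      % (name, caller, time.monotonic() - held_from))

    # usage, mirroring the per-instance lock in the surrounding entries:
    with logged_lock('294a5f91-9db2-4a43-8230-d3e6906c30f0',
                     'do_terminate_instance'):
        pass  # critical section, e.g. terminating the instance

Keying the lock table by name is what serializes the competing build, power-state sync, and terminate paths seen here: all three contend for the same instance-UUID lock, so the terminate only proceeds once the failed build releases it.
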
[ 836.675663] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.676226] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 47.115s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.676468] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.676673] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.676833] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 836.678666] env[59379]: INFO nova.compute.manager [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Terminating instance [ 836.680347] env[59379]: DEBUG nova.compute.manager [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 836.681032] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 836.681032] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5175d03f-bf64-4e36-9fe7-d3a633f3ad38 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 836.689959] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5d5dbad-e43b-4231-bea8-94391fa11edd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 836.701934] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 836.721087] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 294a5f91-9db2-4a43-8230-d3e6906c30f0 could not be found. [ 836.721288] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 836.721453] env[59379]: INFO nova.compute.manager [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 836.721675] env[59379]: DEBUG oslo.service.loopingcall [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 836.721897] env[59379]: DEBUG nova.compute.manager [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 836.722019] env[59379]: DEBUG nova.network.neutron [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 836.744273] env[59379]: DEBUG nova.network.neutron [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 836.752306] env[59379]: INFO nova.compute.manager [-] [instance: 294a5f91-9db2-4a43-8230-d3e6906c30f0] Took 0.03 seconds to deallocate network for instance. [ 836.754392] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 836.754609] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 836.755973] env[59379]: INFO nova.compute.claims [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 836.852047] env[59379]: DEBUG oslo_concurrency.lockutils [None req-ea092af5-458f-4bc6-b710-670549537c7b tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "294a5f91-9db2-4a43-8230-d3e6906c30f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.060199] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d69235dd-5334-4fcd-b431-808263be6958 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.068159] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d0b04a2-1745-4920-aa1b-3fdebfa96ce1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.097334] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53766e36-1149-41fa-8cdf-625e740bb6aa {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.105268] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233eba26-30d4-429f-b213-cac11bb148a2 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.118687] env[59379]: DEBUG nova.compute.provider_tree [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 837.127015] env[59379]: DEBUG nova.scheduler.client.report [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 837.142976] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.388s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 837.143376] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 837.174182] env[59379]: DEBUG nova.compute.utils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 837.175888] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 837.175888] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 837.184140] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 837.230744] env[59379]: DEBUG nova.policy [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b124d362ff241668bb97f2dbfec39c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3b401b63a9c430c97a0d087a98fe664', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 837.254988] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 837.282311] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 837.282541] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 837.282851] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 837.283855] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 837.283855] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 837.283855] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 
tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 837.283855] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 837.283855] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 837.284095] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 837.284095] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 837.284161] env[59379]: DEBUG nova.virt.hardware [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 837.285110] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2809e2fc-ce5e-4bb8-bbaf-be724ea76b3f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.293418] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39c50ec5-7025-432a-adca-25dd03dd50a2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 837.326596] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 837.326863] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 837.433790] env[59379]: DEBUG oslo_service.periodic_task [None 
req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 837.433965] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Cleaning up deleted instances {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 837.451123] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] There are 2 instances to clean {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 837.451468] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: cf939f8d-66e3-4146-8566-2c8d06d6d6da] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 837.490273] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: b9ffb5d9-8d56-4980-9e78-1e003cd56f7e] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 837.530197] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 837.530507] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Cleaning up deleted instances with incomplete migration {{(pid=59379) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 837.539786] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 838.020657] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Successfully created port: 89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 838.547323] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 839.004047] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Successfully updated port: 89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 839.014060] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 839.014251] env[59379]: DEBUG 
oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquired lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 839.014407] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 839.071685] env[59379]: DEBUG nova.compute.manager [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Received event network-vif-plugged-89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 839.071972] env[59379]: DEBUG oslo_concurrency.lockutils [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] Acquiring lock "03742e11-0fb2-48e2-9093-77ea7b647bf3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.072105] env[59379]: DEBUG oslo_concurrency.lockutils [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] Lock "03742e11-0fb2-48e2-9093-77ea7b647bf3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.072300] env[59379]: DEBUG oslo_concurrency.lockutils [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] Lock "03742e11-0fb2-48e2-9093-77ea7b647bf3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.072463] env[59379]: DEBUG nova.compute.manager [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] No waiting events found dispatching network-vif-plugged-89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 839.072623] env[59379]: WARNING nova.compute.manager [req-203de5a6-9df2-4a5e-8043-f67ebedbc0d0 req-57f5cdb9-8045-40b4-9b83-488002ad39d3 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Received unexpected event network-vif-plugged-89166cda-7d43-4466-9de0-463cf88475f3 for instance with vm_state building and task_state spawning. [ 839.082712] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 839.408308] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Updating instance_info_cache with network_info: [{"id": "89166cda-7d43-4466-9de0-463cf88475f3", "address": "fa:16:3e:02:d1:1e", "network": {"id": "37c8bd3b-7829-4486-a247-ea40475d1566", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1178515563-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3b401b63a9c430c97a0d087a98fe664", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89166cda-7d", "ovs_interfaceid": "89166cda-7d43-4466-9de0-463cf88475f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 839.422846] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Releasing lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 839.423143] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance network_info: |[{"id": "89166cda-7d43-4466-9de0-463cf88475f3", "address": "fa:16:3e:02:d1:1e", "network": {"id": "37c8bd3b-7829-4486-a247-ea40475d1566", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1178515563-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3b401b63a9c430c97a0d087a98fe664", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89166cda-7d", "ovs_interfaceid": "89166cda-7d43-4466-9de0-463cf88475f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 839.423508] env[59379]: DEBUG 
nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:02:d1:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '89166cda-7d43-4466-9de0-463cf88475f3', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 839.431314] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Creating folder: Project (e3b401b63a9c430c97a0d087a98fe664). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 839.431959] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 839.432275] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f56b6c20-361c-471e-a839-1425627608ff {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.434593] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 839.443918] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Created folder: Project (e3b401b63a9c430c97a0d087a98fe664) in parent group-v140509. [ 839.444102] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Creating folder: Instances. Parent ref: group-v140568. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 839.444302] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a7bc6da9-0341-4315-8cf4-08fa9d3f26ad {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.453495] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Created folder: Instances in parent group-v140568. [ 839.453849] env[59379]: DEBUG oslo.service.loopingcall [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 839.453924] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 839.454084] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5ed9fc26-9a10-4526-8f18-8940c03b693e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.473688] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 839.473688] env[59379]: value = "task-559611" [ 839.473688] env[59379]: _type = "Task" [ 839.473688] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 839.484214] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559611, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 839.984176] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559611, 'name': CreateVM_Task, 'duration_secs': 0.31072} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 839.984280] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 839.984947] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 839.985121] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 839.985426] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 839.985662] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-33fee811-e968-497a-b82c-43bbe9f7a12f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.990655] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Waiting for the task: (returnval){ [ 839.990655] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5293539c-5d0c-8165-b214-c0e721a74d50" [ 839.990655] env[59379]: _type = "Task" [ 839.990655] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 839.998246] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]5293539c-5d0c-8165-b214-c0e721a74d50, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 840.500763] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 840.501119] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 840.501202] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 841.095859] env[59379]: DEBUG nova.compute.manager [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Received event network-changed-89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 841.096051] env[59379]: DEBUG nova.compute.manager [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Refreshing instance network info cache due to event network-changed-89166cda-7d43-4466-9de0-463cf88475f3. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 841.096265] env[59379]: DEBUG oslo_concurrency.lockutils [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] Acquiring lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 841.096397] env[59379]: DEBUG oslo_concurrency.lockutils [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] Acquired lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 841.096545] env[59379]: DEBUG nova.network.neutron [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Refreshing network info cache for port 89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 841.433130] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 841.433331] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 841.433454] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 841.456971] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457131] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457377] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457532] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457653] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457772] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.457897] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.458026] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.458148] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.458716] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 841.458871] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 841.459349] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 841.459526] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 841.459657] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 841.523735] env[59379]: DEBUG nova.network.neutron [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Updated VIF entry in instance network info cache for port 89166cda-7d43-4466-9de0-463cf88475f3. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 841.524102] env[59379]: DEBUG nova.network.neutron [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Updating instance_info_cache with network_info: [{"id": "89166cda-7d43-4466-9de0-463cf88475f3", "address": "fa:16:3e:02:d1:1e", "network": {"id": "37c8bd3b-7829-4486-a247-ea40475d1566", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1178515563-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e3b401b63a9c430c97a0d087a98fe664", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap89166cda-7d", "ovs_interfaceid": "89166cda-7d43-4466-9de0-463cf88475f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 841.534103] env[59379]: DEBUG oslo_concurrency.lockutils [req-1c59cf97-8d25-48ed-9727-9ad759523844 req-7e01ae1b-4173-408c-a7e4-123524027848 service nova] Releasing lock "refresh_cache-03742e11-0fb2-48e2-9093-77ea7b647bf3" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 842.433581] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 842.442773] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 842.442979] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 842.443150] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 842.443300] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 842.444411] env[59379]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b01d52a2-6804-4a4c-86d2-27d7f928ff0a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 842.453425] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92004c61-5fd9-49ab-8ebe-93f012131449 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 842.468725] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-008d8926-a438-497e-a1c6-8b98913020f1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 842.474911] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de7ba5a5-25e1-4017-8b9c-90de83f61989 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 842.505133] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181751MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 842.505266] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 842.505446] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 842.627059] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627059] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627059] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 19253198-cb6e-4c48-a88b-26780f3606e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627059] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627423] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627423] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627423] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 238825ed-3715-444c-be7c-f42f3884df7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627423] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 8264a1ad-cf20-404f-9d30-30c126e0c222 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627552] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a6ff207e-a925-46d1-9aaf-e06268d3c6f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.627552] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 03742e11-0fb2-48e2-9093-77ea7b647bf3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 842.638741] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.648829] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 5df12084-5dd6-41d1-9743-747f17ce3323 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.658480] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2ed6496a-3e75-4cfd-88da-9e0b731f738a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.667843] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 49d76773-e163-440b-aa99-08c379155149 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.676752] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance dac8465a-592f-461c-af5b-49369eed5e70 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.685733] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 54605814-fdf4-43c7-9316-0d2594cdb5fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.695367] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance f196648e-0e82-4a01-91fc-af1ba61f0490 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.704884] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 66420486-d25e-457d-94cd-6f96fca2df7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.735828] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.747175] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 13aee471-4813-4376-a7bf-70f266d9a399 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.757528] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 06d5ac6a-7734-46e3-80c5-d960821b7552 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.769409] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
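[Annotation] Each of the ten "actively managed" instances audited above holds a placement allocation of {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, while the scheduled-but-not-started instances are skipped by the heal pass. The usage totals the tracker reports in the next few records (used_ram=1792MB, used_disk=10GB, used_vcpus=10) are consistent with exactly those ten active allocations plus the 512MB of reserved host memory shown in the provider inventory; a quick check using only figures taken from this log:

# Sanity check of the tracker totals reported just below, using only
# numbers that appear in this log (512 MB is the inventory's reserved RAM).
active = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10
used_ram_mb = 512 + sum(a['MEMORY_MB'] for a in active)   # -> 1792
used_disk_gb = sum(a['DISK_GB'] for a in active)          # -> 10
used_vcpus = sum(a['VCPU'] for a in active)               # -> 10
assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)
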
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 842.769637] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 842.769780] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 842.998075] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7af55c0-4103-4c2d-bdb8-1c482302c910 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 843.005746] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c9adeaa-13d7-462e-80ee-2263d2909033 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 843.035117] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f72ad89b-83e7-468e-86d0-75cf5758fcaf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 843.042447] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968a7ce1-cada-49f5-9ac9-87e2803bbfdb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 843.055476] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 843.064125] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 843.077616] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 843.077846] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 844.078680] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running 
periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 844.078680] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 846.676719] env[59379]: WARNING oslo_vmware.rw_handles [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 846.676719] env[59379]: ERROR oslo_vmware.rw_handles [ 846.677528] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 846.678687] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 846.678935] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Copying Virtual Disk [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/4164c293-c506-4562-ac7c-60dbd4964510/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 846.679226] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-c105a361-f12a-4e13-aba3-170d85bf9122 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 846.688059] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Waiting for the task: (returnval){ [ 846.688059] env[59379]: value = "task-559612" [ 846.688059] env[59379]: _type = "Task" [ 846.688059] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 846.695104] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Task: {'id': task-559612, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.198017] env[59379]: DEBUG oslo_vmware.exceptions [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 847.198299] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 847.198889] env[59379]: ERROR nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.198889] env[59379]: Faults: ['InvalidArgument'] [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Traceback (most recent call last): [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] yield resources [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self.driver.spawn(context, instance, image_meta, [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 847.198889] env[59379]: ERROR nova.compute.manager 
[instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self._fetch_image_if_missing(context, vi) [ 847.198889] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] image_cache(vi, tmp_image_ds_loc) [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] vm_util.copy_virtual_disk( [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] session._wait_for_task(vmdk_copy_task) [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return self.wait_for_task(task_ref) [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return evt.wait() [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] result = hub.switch() [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.199328] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return self.greenlet.switch() [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self.f(*self.args, **self.kw) [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] raise exceptions.translate_fault(task_info.error) [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Faults: ['InvalidArgument'] [ 847.199734] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] [ 847.199734] env[59379]: INFO nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 
tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Terminating instance [ 847.200711] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 847.200913] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 847.201150] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c555e711-4122-4edf-af38-7149244ab821 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.203381] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 847.203566] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 847.204289] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e394f99-0b78-4cf5-b063-4e9959cbb101 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.211928] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 847.212133] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3d3bfb3f-506b-49a1-9b93-4b07e791add4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.214320] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 847.214481] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 847.215424] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-56d79b4b-ce94-48f3-9553-80bbaba09315 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.219719] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Waiting for the task: (returnval){ [ 847.219719] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]521ec2f6-41f5-42b7-00e4-20a59e00bc39" [ 847.219719] env[59379]: _type = "Task" [ 847.219719] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 847.226565] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]521ec2f6-41f5-42b7-00e4-20a59e00bc39, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.291308] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 847.291570] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 847.291796] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Deleting the datastore file [datastore1] 8264a1ad-cf20-404f-9d30-30c126e0c222 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 847.292153] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-df5290da-2d18-4e15-a3c1-20af95030ee6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.298421] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Waiting for the task: (returnval){ [ 847.298421] env[59379]: value = "task-559614" [ 847.298421] env[59379]: _type = "Task" [ 847.298421] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 847.306221] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Task: {'id': task-559614, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 847.730199] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 847.730631] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Creating directory with path [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 847.730700] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d90438bc-4e68-4deb-b343-aecdf813c7a9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.742464] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Created directory with path [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 847.742644] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Fetch image to [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 847.742806] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 847.743554] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49e26ec9-4ef7-4f8d-bcb3-f2690630bd3c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.750621] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1013f3a9-5d72-4091-ad04-f9700f647ffd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.759729] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-514399f9-2ad1-4fcb-af8a-06ee9883ed92 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.790079] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8df57336-1380-4f86-9661-5fd974527a1b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.795968] env[59379]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fdc8ec82-493b-43c0-8294-f96cdb63edd0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.805994] env[59379]: DEBUG oslo_vmware.api [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Task: {'id': task-559614, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069894} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 847.806240] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 847.806408] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 847.806567] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 847.806730] env[59379]: INFO nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Took 0.60 seconds to destroy the instance on the hypervisor. 
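[Annotation] The spawn failure above is the recurring pattern in this run: the image is staged under vmware_temp/, CopyVirtualDisk_Task is invoked, and oslo.vmware's wait_for_task loop polls the task (the "progress is 0%" lines) until vCenter reports an error, at which point the task fault is translated and surfaces as VimFaultException: A specified parameter was not correct: fileType (Faults: ['InvalidArgument']). A compressed sketch of that polling contract, matching the call chain in the traceback (wait_for_task -> _poll_task -> translate_fault); the endpoint, credentials, and both datastore paths below are placeholders, not values from this deployment:

# Sketch of the oslo.vmware task-polling contract seen in the traceback
# above; VC_HOST/VC_USER/VC_PASS and the datastore paths are placeholders.
from oslo_vmware import api
from oslo_vmware import exceptions as vexc

session = api.VMwareAPISession(
    'VC_HOST', 'VC_USER', 'VC_PASS',
    api_retry_count=10,          # retry budget for transient API errors
    task_poll_interval=0.5)      # seconds between the "progress is N%" polls

# invoke_api issues the SOAP call and returns a Task moref (the log's
# "task-559612"-style ids); wait_for_task then drives _poll_task in a
# looping call until the task succeeds or its error is raised.
disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] vmware_temp/IMAGE/tmp-sparse.vmdk',  # placeholder
    destName='[datastore1] vmware_temp/IMAGE/IMAGE.vmdk')         # placeholder

try:
    session.wait_for_task(task)
except vexc.VimFaultException as exc:
    # The path taken at 847.198: the task error is translated into a
    # VimFaultException whose fault_list here is ['InvalidArgument'].
    print(exc.fault_list, exc)
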
[ 847.808824] env[59379]: DEBUG nova.compute.claims [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 847.808978] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.809196] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.815327] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 847.861782] env[59379]: DEBUG oslo_vmware.rw_handles [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 847.917520] env[59379]: DEBUG oslo_vmware.rw_handles [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 847.917694] env[59379]: DEBUG oslo_vmware.rw_handles [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 848.142618] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f2b1668-4743-49f1-a7f5-39e3e0c647d1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.152952] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c880b9da-af40-4c3e-bf67-78446771a760 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.181287] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ffff00b-2e30-493f-9c41-f2f6889a5493 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.188495] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbeef38a-7e67-4db3-801c-c241c0bd8f18 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.201388] env[59379]: DEBUG nova.compute.provider_tree [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 848.209922] env[59379]: DEBUG nova.scheduler.client.report [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 848.226554] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.417s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.227157] env[59379]: ERROR nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.227157] env[59379]: Faults: ['InvalidArgument'] [ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Traceback (most recent call last): [ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 848.227157] env[59379]: ERROR nova.compute.manager 
[instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self.driver.spawn(context, instance, image_meta,
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self._fetch_image_if_missing(context, vi)
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] image_cache(vi, tmp_image_ds_loc)
[ 848.227157] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] vm_util.copy_virtual_disk(
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] session._wait_for_task(vmdk_copy_task)
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return self.wait_for_task(task_ref)
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return evt.wait()
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] result = hub.switch()
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] return self.greenlet.switch()
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 848.227578] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] self.f(*self.args, **self.kw)
[ 848.227925] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 848.227925] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] raise exceptions.translate_fault(task_info.error)
[ 848.227925] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 848.227925] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Faults: ['InvalidArgument']
[ 848.227925] env[59379]: ERROR nova.compute.manager [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222]
[ 848.227925] env[59379]: DEBUG nova.compute.utils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 848.229383] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Build of instance 8264a1ad-cf20-404f-9d30-30c126e0c222 was re-scheduled: A specified parameter was not correct: fileType
[ 848.229383] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 848.229747] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 848.229959] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 848.230148] env[59379]: DEBUG nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 848.230306] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 848.521271] env[59379]: DEBUG nova.network.neutron [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 848.531578] env[59379]: INFO nova.compute.manager [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] [instance: 8264a1ad-cf20-404f-9d30-30c126e0c222] Took 0.30 seconds to deallocate network for instance.
[ 848.629435] env[59379]: INFO nova.scheduler.client.report [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Deleted allocations for instance 8264a1ad-cf20-404f-9d30-30c126e0c222
[ 848.646956] env[59379]: DEBUG oslo_concurrency.lockutils [None req-cd1bd73c-9e1c-4484-82da-09fd604651d2 tempest-ServerRescueNegativeTestJSON-824119348 tempest-ServerRescueNegativeTestJSON-824119348-project-member] Lock "8264a1ad-cf20-404f-9d30-30c126e0c222" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 179.814s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 848.669493] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 848.721018] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 848.721018] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 848.721018] env[59379]: INFO nova.compute.claims [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 849.015691] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd0a8ccb-4b6b-467b-8731-0bd08d3af898 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.024081] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c73a49d9-f46c-4408-a750-4bc3a4526d0f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.053052] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8d7070e-68e9-48fa-83ac-e51bb28ec6de {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.060242] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ae7e329-d70d-4f03-98a3-226b3ef24946 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.073355] env[59379]: DEBUG nova.compute.provider_tree [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 849.104635] env[59379]: DEBUG nova.scheduler.client.report [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 849.117347] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 849.117893] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 849.149845] env[59379]: DEBUG nova.compute.utils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 849.153015] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 849.153015] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 849.160269] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 849.224659] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 849.228309] env[59379]: DEBUG nova.policy [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e11af25776d40a59c3119ba4d8512fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0d64b8a4deb3494ab94a07334278cf23', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}}
[ 849.249360] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=<?>,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-31T09:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 849.249623] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 849.249784] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 849.249990] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 849.250156] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 849.250297] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 849.250490] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 849.250636] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 849.250789] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 849.250981] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 849.251164] env[59379]: DEBUG nova.virt.hardware [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 849.251987] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55a8d3bf-062a-4f41-85ca-85ff2892c2ff {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.259787] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-723b436e-25a0-49d1-b85f-b086fcb2080a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 849.584348] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Successfully created port: ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 850.798424] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Successfully updated port: ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 850.812379] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 850.812379] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquired lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 850.812379] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}}
[ 850.884038] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}}
[ 850.991039] env[59379]: DEBUG nova.compute.manager [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Received event network-vif-plugged-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 850.991039] env[59379]: DEBUG oslo_concurrency.lockutils [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] Acquiring lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 850.991039] env[59379]: DEBUG oslo_concurrency.lockutils [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] Lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 850.991039] env[59379]: DEBUG oslo_concurrency.lockutils [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] Lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 850.991672] env[59379]: DEBUG nova.compute.manager [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] No waiting events found dispatching network-vif-plugged-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 850.991974] env[59379]: WARNING nova.compute.manager [req-cfe34219-b5c9-4e0a-b321-860ad7ddd593 req-07e773b2-ea22-45e5-967a-334b5cd1598d service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Received unexpected event network-vif-plugged-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 for instance with vm_state building and task_state spawning.
[ 851.185177] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Updating instance_info_cache with network_info: [{"id": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "address": "fa:16:3e:b1:f2:85", "network": {"id": "92556f2f-cba5-48e0-b411-5dc65ba39189", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1002452343-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0d64b8a4deb3494ab94a07334278cf23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea4f6a00-ed", "ovs_interfaceid": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 851.199328] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Releasing lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 851.199621] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance network_info: |[{"id": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "address": "fa:16:3e:b1:f2:85", "network": {"id": "92556f2f-cba5-48e0-b411-5dc65ba39189", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1002452343-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0d64b8a4deb3494ab94a07334278cf23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea4f6a00-ed", "ovs_interfaceid": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 851.200058] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b1:f2:85', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae215ba8-f7a5-4b23-a055-90316d29817f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 851.207592] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Creating folder: Project (0d64b8a4deb3494ab94a07334278cf23). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 851.211024] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9c40c886-67cb-4881-9595-3e4c93f962f1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 851.219859] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Created folder: Project (0d64b8a4deb3494ab94a07334278cf23) in parent group-v140509.
[ 851.220120] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Creating folder: Instances. Parent ref: group-v140571. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 851.220342] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03542e66-a7bd-4b68-83b3-2dcf39cd78c6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 851.229036] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Created folder: Instances in parent group-v140571.
[ 851.229252] env[59379]: DEBUG oslo.service.loopingcall [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 851.229415] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 851.229590] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ab07d24-edcc-448b-93ea-51bcdd9de4a5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 851.251607] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 851.251607] env[59379]: value = "task-559617"
[ 851.251607] env[59379]: _type = "Task"
[ 851.251607] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 851.260875] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559617, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 851.763179] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559617, 'name': CreateVM_Task, 'duration_secs': 0.307402} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 851.763336] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 851.764067] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 851.764624] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 851.764624] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 851.764788] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f1978b9-e31d-44be-bd27-5120689bf3cd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 851.769754] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Waiting for the task: (returnval){
[ 851.769754] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]529c798f-9a4d-c46a-f7dc-a67480a63ad1"
[ 851.769754] env[59379]: _type = "Task"
[ 851.769754] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 851.777916] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]529c798f-9a4d-c46a-f7dc-a67480a63ad1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 852.280164] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 852.280655] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 852.280906] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 853.087080] env[59379]: DEBUG nova.compute.manager [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Received event network-changed-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 853.087272] env[59379]: DEBUG nova.compute.manager [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Refreshing instance network info cache due to event network-changed-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}}
[ 853.087471] env[59379]: DEBUG oslo_concurrency.lockutils [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] Acquiring lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 853.087598] env[59379]: DEBUG oslo_concurrency.lockutils [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] Acquired lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 853.087743] env[59379]: DEBUG nova.network.neutron [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Refreshing network info cache for port ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}}
[ 853.685674] env[59379]: DEBUG nova.network.neutron [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Updated VIF entry in instance network info cache for port ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}}
[ 853.686040] env[59379]: DEBUG nova.network.neutron [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Updating instance_info_cache with network_info: [{"id": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "address": "fa:16:3e:b1:f2:85", "network": {"id": "92556f2f-cba5-48e0-b411-5dc65ba39189", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1002452343-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0d64b8a4deb3494ab94a07334278cf23", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae215ba8-f7a5-4b23-a055-90316d29817f", "external-id": "nsx-vlan-transportzone-798", "segmentation_id": 798, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea4f6a00-ed", "ovs_interfaceid": "ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 853.695901] env[59379]: DEBUG oslo_concurrency.lockutils [req-5959ccb5-dccc-4b64-a02b-30e08708b9cb req-dca0c510-0875-4b6b-bb82-62b54dea5d7f service nova] Releasing lock "refresh_cache-05010bc2-c30a-49bf-8daa-3eec6a5e9022" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 866.178933] env[59379]: DEBUG nova.compute.manager [req-ef06a478-3d69-438f-809b-1f0c8f730790 req-87a3d2a9-8d69-4fd7-ae9f-93341361d967 service nova] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Received event network-vif-deleted-457011f1-233f-4316-bfbf-dbda2457934a {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 869.560820] env[59379]: DEBUG nova.compute.manager [req-b895e4f9-ce31-4f44-9c99-c53c0ac88590 req-0b29ac5d-66c4-4950-bf6d-7b8af09928df service nova] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Received event network-vif-deleted-b6c227db-1781-458a-8f43-a3a885499260 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 872.113321] env[59379]: DEBUG nova.compute.manager [req-eb154f28-6f81-47f7-bf9f-85dab5f1bf8b req-2dd1f1ab-dce6-48b5-96bf-88e47311ea32 service nova] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Received event network-vif-deleted-89166cda-7d43-4466-9de0-463cf88475f3 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 875.277696] env[59379]: DEBUG nova.compute.manager [req-fcc9e3fc-fded-432c-9674-ecd3009ed104 req-807da1e8-0672-4bc2-af5d-2a4ca8415f50 service nova] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Received event network-vif-deleted-ea4f6a00-ed6c-40d0-bce5-b3cdf7b34789 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 882.711473] env[59379]: WARNING oslo_vmware.rw_handles [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles response.begin()
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 882.711473] env[59379]: ERROR oslo_vmware.rw_handles
[ 882.712447] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 882.713891] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 882.718610] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Copying Virtual Disk [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/fab7fa57-fe63-4eb7-ba3c-a21c72742cd5/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 882.718610] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0714c33e-6c69-4a7b-a422-fa22e9f38fbb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 882.723815] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Waiting for the task: (returnval){
[ 882.723815] env[59379]: value = "task-559618"
[ 882.723815] env[59379]: _type = "Task"
[ 882.723815] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 882.732927] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Task: {'id': task-559618, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 883.238741] env[59379]: DEBUG oslo_vmware.exceptions [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 883.238990] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 883.243707] env[59379]: ERROR nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 883.243707] env[59379]: Faults: ['InvalidArgument']
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Traceback (most recent call last):
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] yield resources
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self.driver.spawn(context, instance, image_meta,
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self._fetch_image_if_missing(context, vi)
[ 883.243707] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] image_cache(vi, tmp_image_ds_loc)
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] vm_util.copy_virtual_disk(
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] session._wait_for_task(vmdk_copy_task)
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return self.wait_for_task(task_ref)
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return evt.wait()
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] result = hub.switch()
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 883.244168] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return self.greenlet.switch()
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self.f(*self.args, **self.kw)
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] raise exceptions.translate_fault(task_info.error)
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Faults: ['InvalidArgument']
[ 883.244540] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8]
[ 883.244540] env[59379]: INFO nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Terminating instance
[ 883.245689] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 883.245891] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 883.248956] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 883.248956] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 883.249243] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c0ff1163-94d5-4abb-828e-795742edfd31 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.252018] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-124b26ad-1500-4ccb-a537-c38a789f6e8b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.266375] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 883.267130] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7afb99dc-b8b9-4185-8173-24a0c2f456df {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.268692] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 883.268872] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 883.269568] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ee6dfa21-587a-46b1-80cb-25282da118f2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.275327] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for the task: (returnval){
[ 883.275327] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5224a56f-232d-973e-c21a-0df9485def67"
[ 883.275327] env[59379]: _type = "Task"
[ 883.275327] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 883.290691] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 883.291186] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Creating directory with path [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 883.291288] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9150b06c-c0dc-4125-8fa8-ba1bb8d621a7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.315635] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Created directory with path [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 883.316495] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Fetch image to [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 883.316495] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 883.316797] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9166ef2-4015-4080-9f4d-45041157e46a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.326209] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63a3b89f-4f57-4ce3-bd07-430017fef98f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.334652] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71173938-c34e-47d1-b53a-e061a5861d44 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.368365] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-862beebf-666c-43b0-878c-6de92facab8c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.371287] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 883.372028] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 883.372028] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Deleting the datastore file [datastore2] 19253198-cb6e-4c48-a88b-26780f3606e8 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 883.372028] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-200173a4-de12-43bc-a8d5-bb2ca6369569 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.377240] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4bf7f70a-41d7-441a-8bf6-4c2e7937470d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 883.379675] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Waiting for the task: (returnval){
[ 883.379675] env[59379]: value = "task-559620"
[ 883.379675] env[59379]: _type = "Task"
[ 883.379675] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 883.388139] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Task: {'id': task-559620, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 883.398699] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 883.447614] env[59379]: DEBUG oslo_vmware.rw_handles [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 883.504392] env[59379]: DEBUG oslo_vmware.rw_handles [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 883.504564] env[59379]: DEBUG oslo_vmware.rw_handles [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 883.890446] env[59379]: DEBUG oslo_vmware.api [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Task: {'id': task-559620, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069885} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 883.890732] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 883.890940] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 883.891180] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 883.891342] env[59379]: INFO nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 883.896489] env[59379]: DEBUG nova.compute.claims [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 883.896661] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 883.896876] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 884.074116] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37ba3716-9c99-4abf-994f-3eb1bbf98790 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.082168] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efa09595-792d-4eb1-9e1c-390b266070da {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.121862] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c62c2e-9b05-4b75-aff5-b3e653aed5ca {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 884.130809] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-fe8551d9-8a7b-4791-8459-6acb0a70277b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.148490] env[59379]: DEBUG nova.compute.provider_tree [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 884.165560] env[59379]: DEBUG nova.scheduler.client.report [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 884.184061] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.287s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 884.184619] env[59379]: ERROR nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.184619] env[59379]: Faults: ['InvalidArgument'] [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Traceback (most recent call last): [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self.driver.spawn(context, instance, image_meta, [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self._fetch_image_if_missing(context, vi) [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 
19253198-cb6e-4c48-a88b-26780f3606e8] image_cache(vi, tmp_image_ds_loc) [ 884.184619] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] vm_util.copy_virtual_disk( [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] session._wait_for_task(vmdk_copy_task) [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return self.wait_for_task(task_ref) [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return evt.wait() [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] result = hub.switch() [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] return self.greenlet.switch() [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 884.185208] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] self.f(*self.args, **self.kw) [ 884.185790] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 884.185790] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] raise exceptions.translate_fault(task_info.error) [ 884.185790] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 884.185790] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Faults: ['InvalidArgument'] [ 884.185790] env[59379]: ERROR nova.compute.manager [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] [ 884.185790] env[59379]: DEBUG nova.compute.utils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 884.192456] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff 
tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Build of instance 19253198-cb6e-4c48-a88b-26780f3606e8 was re-scheduled: A specified parameter was not correct: fileType [ 884.192456] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 884.192456] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 884.192456] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 884.192456] env[59379]: DEBUG nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 884.192723] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 884.922733] env[59379]: DEBUG nova.network.neutron [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 884.938502] env[59379]: INFO nova.compute.manager [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Took 0.75 seconds to deallocate network for instance. 
[ 885.045906] env[59379]: INFO nova.scheduler.client.report [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Deleted allocations for instance 19253198-cb6e-4c48-a88b-26780f3606e8 [ 885.066761] env[59379]: DEBUG oslo_concurrency.lockutils [None req-93123fb2-1963-4c28-b1cc-c45b27ef1fff tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 296.709s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.067963] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 291.221s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.068169] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] During sync_power_state the instance has a pending task (spawning). Skip. [ 885.068334] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.068773] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 93.603s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.068987] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Acquiring lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.069615] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.069944] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59379) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.071709] env[59379]: INFO nova.compute.manager [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Terminating instance [ 885.073443] env[59379]: DEBUG nova.compute.manager [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 885.073640] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 885.074254] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7a218a05-d1ec-4241-ae57-cd656517d9b1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.085747] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-746b6698-a6f5-4c31-99a7-cc25bacdcfeb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.098683] env[59379]: DEBUG nova.compute.manager [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 5df12084-5dd6-41d1-9743-747f17ce3323] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.118604] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 19253198-cb6e-4c48-a88b-26780f3606e8 could not be found. [ 885.118816] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 885.118987] env[59379]: INFO nova.compute.manager [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Took 0.05 seconds to destroy the instance on the hypervisor. [ 885.119349] env[59379]: DEBUG oslo.service.loopingcall [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 885.119556] env[59379]: DEBUG nova.compute.manager [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 885.119650] env[59379]: DEBUG nova.network.neutron [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 885.124750] env[59379]: DEBUG nova.compute.manager [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] [instance: 5df12084-5dd6-41d1-9743-747f17ce3323] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.145785] env[59379]: DEBUG oslo_concurrency.lockutils [None req-548c9425-6a50-4d5f-a318-e114e48dce4f tempest-DeleteServersTestJSON-1450295516 tempest-DeleteServersTestJSON-1450295516-project-member] Lock "5df12084-5dd6-41d1-9743-747f17ce3323" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 210.856s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.154695] env[59379]: DEBUG nova.compute.manager [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] [instance: 2ed6496a-3e75-4cfd-88da-9e0b731f738a] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.158304] env[59379]: DEBUG nova.network.neutron [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 885.166124] env[59379]: INFO nova.compute.manager [-] [instance: 19253198-cb6e-4c48-a88b-26780f3606e8] Took 0.05 seconds to deallocate network for instance. [ 885.187219] env[59379]: DEBUG nova.compute.manager [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] [instance: 2ed6496a-3e75-4cfd-88da-9e0b731f738a] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.224261] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1a5f73e9-aea6-4191-a8b4-431765861858 tempest-ServerGroupTestJSON-1237952674 tempest-ServerGroupTestJSON-1237952674-project-member] Lock "2ed6496a-3e75-4cfd-88da-9e0b731f738a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 209.342s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.239886] env[59379]: DEBUG nova.compute.manager [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 49d76773-e163-440b-aa99-08c379155149] Starting instance...
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.270715] env[59379]: DEBUG nova.compute.manager [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] [instance: 49d76773-e163-440b-aa99-08c379155149] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.298777] env[59379]: DEBUG oslo_concurrency.lockutils [None req-7a018914-2a3f-4174-be84-1fe6925178cc tempest-ServerExternalEventsTest-827682188 tempest-ServerExternalEventsTest-827682188-project-member] Lock "19253198-cb6e-4c48-a88b-26780f3606e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.230s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.305303] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5988d750-59d3-4b51-8b34-9653780a2324 tempest-ServersTestJSON-1871376988 tempest-ServersTestJSON-1871376988-project-member] Lock "49d76773-e163-440b-aa99-08c379155149" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.966s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.319749] env[59379]: DEBUG nova.compute.manager [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: dac8465a-592f-461c-af5b-49369eed5e70] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.357204] env[59379]: DEBUG nova.compute.manager [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] [instance: dac8465a-592f-461c-af5b-49369eed5e70] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.387807] env[59379]: DEBUG oslo_concurrency.lockutils [None req-3ac6f48c-ebc2-4bc7-9198-de6d492764ef tempest-ServerDiskConfigTestJSON-1795533670 tempest-ServerDiskConfigTestJSON-1795533670-project-member] Lock "dac8465a-592f-461c-af5b-49369eed5e70" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.915s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.399617] env[59379]: DEBUG nova.compute.manager [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: 54605814-fdf4-43c7-9316-0d2594cdb5fa] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.428173] env[59379]: DEBUG nova.compute.manager [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: 54605814-fdf4-43c7-9316-0d2594cdb5fa] Instance disappeared before build.
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.453476] env[59379]: DEBUG oslo_concurrency.lockutils [None req-b4c2d3a9-5cd7-4113-8d35-670bb3ea1350 tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "54605814-fdf4-43c7-9316-0d2594cdb5fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.563s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.463620] env[59379]: DEBUG nova.compute.manager [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] [instance: f196648e-0e82-4a01-91fc-af1ba61f0490] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.496282] env[59379]: DEBUG nova.compute.manager [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] [instance: f196648e-0e82-4a01-91fc-af1ba61f0490] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.526280] env[59379]: DEBUG oslo_concurrency.lockutils [None req-1f90d099-0f06-4f6c-8f82-3f162df27704 tempest-FloatingIPsAssociationTestJSON-1854484624 tempest-FloatingIPsAssociationTestJSON-1854484624-project-member] Lock "f196648e-0e82-4a01-91fc-af1ba61f0490" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.923s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.536937] env[59379]: DEBUG nova.compute.manager [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] [instance: 66420486-d25e-457d-94cd-6f96fca2df7d] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.567727] env[59379]: DEBUG nova.compute.manager [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] [instance: 66420486-d25e-457d-94cd-6f96fca2df7d] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.590922] env[59379]: DEBUG oslo_concurrency.lockutils [None req-eb8449b9-6d26-4a9d-a599-af26500e764c tempest-ServerShowV257Test-322135073 tempest-ServerShowV257Test-322135073-project-member] Lock "66420486-d25e-457d-94cd-6f96fca2df7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.843s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.600291] env[59379]: DEBUG nova.compute.manager [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0] Starting instance...
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.626576] env[59379]: DEBUG nova.compute.manager [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] [instance: a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0] Instance disappeared before build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 885.653019] env[59379]: DEBUG oslo_concurrency.lockutils [None req-01085bf9-6130-43fe-8c8e-78467ff0241a tempest-ServerShowV247Test-1223997438 tempest-ServerShowV247Test-1223997438-project-member] Lock "a1dd5ca8-4210-4950-8a6e-a7e05b9a38a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.508s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.662546] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 885.724567] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.724976] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 885.726724] env[59379]: INFO nova.compute.claims [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 885.897401] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df9f1f0b-213e-4e3b-9f1c-ad846056e6f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.906724] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baf787eb-3144-40b2-a0cb-bb694b462236 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.937609] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d316e3c5-64b8-4d8b-9f74-8a3229414d1d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.944515] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07439d7d-aebc-4e73-89a1-a40b63b91711 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.958778] env[59379]: DEBUG nova.compute.provider_tree [None
req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 885.968796] env[59379]: DEBUG nova.scheduler.client.report [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 885.988019] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 885.988019] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 886.026038] env[59379]: DEBUG nova.compute.utils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 886.027562] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 886.027562] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 886.040105] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 886.072487] env[59379]: INFO nova.virt.block_device [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Booting with volume 9c19a3a5-ade9-4b1f-afd2-6e797690b152 at /dev/sda [ 886.107998] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-13804fd8-8ed5-43a4-8f52-ad403777198d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.114298] env[59379]: DEBUG nova.policy [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '046a5a655c774306b2680b89927ba285', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '396b40d8809545dc8eeb0fc355cfcc58', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 886.119489] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa431ce8-997c-48d3-ac66-288c4bfd8461 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.149203] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fcad4af4-1051-4704-9cd3-7b77789e49f7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.157452] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d5ab7f-f7c0-47e8-8d5c-8cc95ac28cf6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.185692] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b608dc33-6dd3-4031-9f0c-82e37a625b2e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.192037] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4533cffa-4d2c-47db-a06e-4bc6a46fb7c1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.204854] env[59379]: DEBUG nova.virt.block_device [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating existing volume attachment record: 02d65261-f106-4b85-8a32-25697f0b2252 {{(pid=59379) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 886.447468] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 886.447993] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 886.448218] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 886.448359] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 886.448528] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 886.448661] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 886.448795] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 886.448993] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 886.449270] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 886.449447] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member]
Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 886.450026] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 886.450240] env[59379]: DEBUG nova.virt.hardware [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 886.451743] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-958e1ef0-e9b4-4b4f-93a6-15a1d4c69413 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.461653] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e56ec6f2-b7ed-4e5b-aadd-bdcc82b4fba6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.650600] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Successfully created port: e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 887.348114] env[59379]: DEBUG nova.compute.manager [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Received event network-vif-plugged-e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 887.348406] env[59379]: DEBUG oslo_concurrency.lockutils [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] Acquiring lock "13aee471-4813-4376-a7bf-70f266d9a399-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 887.348514] env[59379]: DEBUG oslo_concurrency.lockutils [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] Lock "13aee471-4813-4376-a7bf-70f266d9a399-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 887.348665] env[59379]: DEBUG oslo_concurrency.lockutils [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] Lock "13aee471-4813-4376-a7bf-70f266d9a399-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 887.348820] env[59379]: DEBUG nova.compute.manager [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] No waiting events found dispatching
network-vif-plugged-e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 887.348976] env[59379]: WARNING nova.compute.manager [req-c92c07de-1752-4dca-9ceb-72bb34551a8d req-eea7334e-8292-45d4-914d-e114236d1e8b service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Received unexpected event network-vif-plugged-e2da95d8-c28e-422e-a2bc-70407e33455f for instance with vm_state building and task_state spawning. [ 887.724211] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Successfully updated port: e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 887.733716] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 887.733848] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquired lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 887.733991] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 887.827265] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 888.177042] env[59379]: DEBUG nova.network.neutron [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating instance_info_cache with network_info: [{"id": "e2da95d8-c28e-422e-a2bc-70407e33455f", "address": "fa:16:3e:a0:6e:26", "network": {"id": "2f2053e4-da0c-4230-aad7-e0062af41d2e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-327622125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "396b40d8809545dc8eeb0fc355cfcc58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f2c424c9-6446-4b2a-af8c-4d9c29117c39", "external-id": "nsx-vlan-transportzone-437", "segmentation_id": 437, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2da95d8-c2", "ovs_interfaceid": "e2da95d8-c28e-422e-a2bc-70407e33455f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 888.195348] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Releasing lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 888.195348] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Instance network_info: |[{"id": "e2da95d8-c28e-422e-a2bc-70407e33455f", "address": "fa:16:3e:a0:6e:26", "network": {"id": "2f2053e4-da0c-4230-aad7-e0062af41d2e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-327622125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "396b40d8809545dc8eeb0fc355cfcc58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f2c424c9-6446-4b2a-af8c-4d9c29117c39", "external-id": "nsx-vlan-transportzone-437", "segmentation_id": 437, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2da95d8-c2", "ovs_interfaceid": "e2da95d8-c28e-422e-a2bc-70407e33455f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
888.195485] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a0:6e:26', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f2c424c9-6446-4b2a-af8c-4d9c29117c39', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e2da95d8-c28e-422e-a2bc-70407e33455f', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 888.201363] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Creating folder: Project (396b40d8809545dc8eeb0fc355cfcc58). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 888.202038] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f85513b-4e19-4397-bc51-b0313a123530 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.216610] env[59379]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 888.216610] env[59379]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59379) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 888.216610] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Folder already exists: Project (396b40d8809545dc8eeb0fc355cfcc58). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 888.216610] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Creating folder: Instances. Parent ref: group-v140550. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 888.217381] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2e8734f-cdb2-4912-865c-e8549fe729ea {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.226705] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Created folder: Instances in parent group-v140550. [ 888.226994] env[59379]: DEBUG oslo.service.loopingcall [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 888.227190] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 888.227376] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-18bf4db1-8f29-4f5f-acf7-c01f79d2c443 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.252295] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 888.252295] env[59379]: value = "task-559623" [ 888.252295] env[59379]: _type = "Task" [ 888.252295] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 888.261911] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559623, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 888.763835] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559623, 'name': CreateVM_Task, 'duration_secs': 0.33445} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 888.764125] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 888.764650] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140553', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'name': 'volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13aee471-4813-4376-a7bf-70f266d9a399', 'attached_at': '', 'detached_at': '', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'serial': '9c19a3a5-ade9-4b1f-afd2-6e797690b152'}, 'delete_on_termination': True, 'guest_format': None, 'attachment_id': '02d65261-f106-4b85-8a32-25697f0b2252', 'mount_device': '/dev/sda', 'device_type': None, 'boot_index': 0, 'disk_bus': None, 'volume_type': None}], 'swap': None} {{(pid=59379) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 888.764848] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Root volume attach. 
Driver type: vmdk {{(pid=59379) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 888.765614] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8997b80a-08b8-4703-a282-b31c6f2445fe {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.774068] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc8c1aec-dce0-4603-b4cc-c63d5b061c61 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.781605] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e516e88-7938-4ec7-9cb1-ad2ca1d48763 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.790363] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-1fb69217-a575-4525-8a6f-54ff64256fd2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.797535] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){ [ 888.797535] env[59379]: value = "task-559624" [ 888.797535] env[59379]: _type = "Task" [ 888.797535] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 888.805438] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 889.312470] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 34%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 889.463391] env[59379]: DEBUG nova.compute.manager [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Received event network-changed-e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 889.463391] env[59379]: DEBUG nova.compute.manager [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Refreshing instance network info cache due to event network-changed-e2da95d8-c28e-422e-a2bc-70407e33455f. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 889.463391] env[59379]: DEBUG oslo_concurrency.lockutils [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] Acquiring lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 889.463557] env[59379]: DEBUG oslo_concurrency.lockutils [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] Acquired lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 889.463836] env[59379]: DEBUG nova.network.neutron [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Refreshing network info cache for port e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 889.815286] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 47%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 890.101888] env[59379]: DEBUG nova.network.neutron [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updated VIF entry in instance network info cache for port e2da95d8-c28e-422e-a2bc-70407e33455f. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 890.102284] env[59379]: DEBUG nova.network.neutron [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating instance_info_cache with network_info: [{"id": "e2da95d8-c28e-422e-a2bc-70407e33455f", "address": "fa:16:3e:a0:6e:26", "network": {"id": "2f2053e4-da0c-4230-aad7-e0062af41d2e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-327622125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "396b40d8809545dc8eeb0fc355cfcc58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f2c424c9-6446-4b2a-af8c-4d9c29117c39", "external-id": "nsx-vlan-transportzone-437", "segmentation_id": 437, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2da95d8-c2", "ovs_interfaceid": "e2da95d8-c28e-422e-a2bc-70407e33455f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 890.112805] env[59379]: DEBUG oslo_concurrency.lockutils [req-c523c135-29be-4144-87d6-ce40c0dbc5dd req-960e2541-5f73-4a5f-809d-01fb661abdb5 service nova] Releasing lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" 
{{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 890.313954] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 60%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 890.812325] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 73%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 891.312044] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 88%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 891.814757] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task} progress is 97%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.316754] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559624, 'name': RelocateVM_Task, 'duration_secs': 3.504138} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 892.317745] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Volume attach. 
Driver type: vmdk {{(pid=59379) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 892.317881] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140553', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'name': 'volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13aee471-4813-4376-a7bf-70f266d9a399', 'attached_at': '', 'detached_at': '', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'serial': '9c19a3a5-ade9-4b1f-afd2-6e797690b152'} {{(pid=59379) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 892.318943] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9de7d272-8db4-4359-aa27-4e752e7d58a8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.339091] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f9ddb99-ac10-4c1b-b99a-9efd09aec110 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.363788] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Reconfiguring VM instance instance-0000001c to attach disk [datastore1] volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152/volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152.vmdk or device None with type thin {{(pid=59379) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 892.364316] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-c99a281c-64b5-4b93-9573-e11a7d3321db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.383442] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){ [ 892.383442] env[59379]: value = "task-559625" [ 892.383442] env[59379]: _type = "Task" [ 892.383442] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 892.391461] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559625, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 892.893619] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559625, 'name': ReconfigVM_Task, 'duration_secs': 0.272452} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 892.893758] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Reconfigured VM instance instance-0000001c to attach disk [datastore1] volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152/volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152.vmdk or device None with type thin {{(pid=59379) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 892.898477] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6bed1ec5-ca87-4e7e-8ec8-d512f5c7b836 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 892.913209] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){ [ 892.913209] env[59379]: value = "task-559626" [ 892.913209] env[59379]: _type = "Task" [ 892.913209] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 892.920972] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559626, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 893.424424] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559626, 'name': ReconfigVM_Task, 'duration_secs': 0.126862} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 893.424694] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140553', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'name': 'volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13aee471-4813-4376-a7bf-70f266d9a399', 'attached_at': '', 'detached_at': '', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'serial': '9c19a3a5-ade9-4b1f-afd2-6e797690b152'} {{(pid=59379) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 893.424992] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-4e44c130-092e-4836-bd11-e3f30b7a6c70 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.431614] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){ [ 893.431614] env[59379]: value = "task-559627" [ 893.431614] env[59379]: _type = "Task" [ 893.431614] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 893.443609] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559627, 'name': Rename_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 893.944454] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559627, 'name': Rename_Task, 'duration_secs': 0.130495} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 893.944454] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Powering on the VM {{(pid=59379) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 893.944454] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-faf63ae3-33fb-4af8-8e0d-5f160a5319e1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 893.954396] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){ [ 893.954396] env[59379]: value = "task-559628" [ 893.954396] env[59379]: _type = "Task" [ 893.954396] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 893.961845] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559628, 'name': PowerOnVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 894.466444] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559628, 'name': PowerOnVM_Task} progress is 66%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 894.964634] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559628, 'name': PowerOnVM_Task} progress is 100%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 895.466034] env[59379]: DEBUG oslo_vmware.api [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559628, 'name': PowerOnVM_Task, 'duration_secs': 1.013687} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 895.466479] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Powered on the VM {{(pid=59379) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 895.467061] env[59379]: INFO nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Took 9.02 seconds to spawn the instance on the hypervisor. [ 895.467388] env[59379]: DEBUG nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Checking state {{(pid=59379) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 895.468305] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b35440a-7daa-431f-8a49-d195157e06b7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.525783] env[59379]: INFO nova.compute.manager [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Took 9.81 seconds to build instance. 
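
The task entries above — CreateVM_Task, RelocateVM_Task, the two ReconfigVM_Task calls, Rename_Task, PowerOnVM_Task — all trace the same oslo.vmware wait-for-task pattern: submit a vSphere task, then poll it until it reaches success or error, logging "progress is N%" on each poll. A minimal sketch of that poll loop, assuming a hypothetical fetch_task_info() callable and TaskInfo record in place of the real vSphere SDK task objects:

    # Sketch only: TaskInfo and fetch_task_info() are illustrative stand-ins,
    # not the vSphere SDK's task objects or oslo.vmware's real API.
    import time
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class TaskInfo:
        state: str                  # 'queued' | 'running' | 'success' | 'error'
        progress: int = 0           # percent complete ("progress is N%")
        error: Optional[str] = None

    def poll_task_until_done(fetch_task_info: Callable[[], TaskInfo],
                             poll_interval: float = 0.5) -> TaskInfo:
        """Poll a server-side task until it succeeds or fails."""
        while True:
            info = fetch_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # oslo.vmware raises here after translating the fault,
                # as the VimFaultException traceback further down shows.
                raise RuntimeError(info.error or 'task failed')
            print(f"progress is {info.progress}%")  # cf. the _poll_task lines
            time.sleep(poll_interval)

    if __name__ == '__main__':
        # Simulate a task that, like PowerOnVM_Task task-559628 above,
        # reports 0% and 66% before completing.
        states = iter([TaskInfo('running', 0), TaskInfo('running', 66),
                       TaskInfo('success', 100)])
        print(poll_task_until_done(lambda: next(states), poll_interval=0))

Success returns the final task info (the log's "completed successfully" lines include the measured duration_secs); an error state becomes an exception instead, which is exactly the path the failed CopyVirtualDisk_Task below takes.
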
[ 895.546020] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e35a761d-52ae-4903-adda-5b41a0cf70b8 tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 178.903s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 895.555035] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 895.613416] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 895.613500] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 895.618150] env[59379]: INFO nova.compute.claims [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 895.818186] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d84f33f3-0a19-4476-8324-aebad15df59b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.826301] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e3ea51a-ba7d-41fe-9b14-ae66e786bf28 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.856012] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09cdd1c5-c0a9-4541-b954-2b2cb4625582 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.863632] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4acc4693-ec64-4f3a-bbb0-e481e1dcdf6d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.878985] env[59379]: DEBUG nova.compute.provider_tree [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 895.888506] env[59379]: DEBUG nova.scheduler.client.report [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 
tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 895.900619] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 895.901121] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 895.950486] env[59379]: DEBUG nova.compute.utils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 895.952000] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 895.952176] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 895.967658] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 896.039448] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Start spawning the instance on the hypervisor. 
{{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 896.055059] env[59379]: DEBUG nova.policy [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f336cb51b0c249b09243817121e20c63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '60a64c047df340008256e691a618d959', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 896.067976] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 896.068135] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 896.068285] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 896.068476] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 896.068591] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 896.068731] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 896.068930] env[59379]: DEBUG 
nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 896.069392] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 896.069605] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 896.069773] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 896.069969] env[59379]: DEBUG nova.virt.hardware [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 896.071479] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4330c93d-d27e-40c1-810e-8ec001aeb9c4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.083345] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a9e5717-9202-4804-84e2-eee303eb6727 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.369022] env[59379]: WARNING oslo_vmware.rw_handles [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 
896.369022] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 896.369022] env[59379]: ERROR oslo_vmware.rw_handles [ 896.369022] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 896.369577] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 896.369669] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Copying Virtual Disk [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/1d0ffc5b-d86b-4316-b501-6d50ab03f669/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 896.370124] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-667da574-13eb-4304-9f21-d602e838c067 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.377514] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Waiting for the task: (returnval){ [ 896.377514] env[59379]: value = "task-559629" [ 896.377514] env[59379]: _type = "Task" [ 896.377514] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 896.386102] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Task: {'id': task-559629, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.892572] env[59379]: DEBUG oslo_vmware.exceptions [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 896.893516] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 896.897605] env[59379]: ERROR nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 896.897605] env[59379]: Faults: ['InvalidArgument'] [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Traceback (most recent call last): [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] yield resources [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] self.driver.spawn(context, instance, image_meta, [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] self._fetch_image_if_missing(context, vi) [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 896.897605] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] image_cache(vi, tmp_image_ds_loc) [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] vm_util.copy_virtual_disk( [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] session._wait_for_task(vmdk_copy_task) [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 896.898121] env[59379]: ERROR 
nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] return self.wait_for_task(task_ref) [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] return evt.wait() [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] result = hub.switch() [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] return self.greenlet.switch() [ 896.898121] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] self.f(*self.args, **self.kw) [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] raise exceptions.translate_fault(task_info.error) [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Faults: ['InvalidArgument'] [ 896.898561] env[59379]: ERROR nova.compute.manager [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] [ 896.898561] env[59379]: INFO nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Terminating instance [ 896.901212] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 896.901212] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 896.901591] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 896.901793] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 896.902085] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf2b1b6d-d3d3-4460-a268-ee39bf02d2ea {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.904917] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6051dc6b-3b6b-4f69-bd22-f6b640709088 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.916232] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 896.916570] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4639c275-9af3-4ec3-a176-5583319c0c6b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.918097] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 896.918263] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 896.918948] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e5bae61b-068b-4044-bb76-7b3e68d755b7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.924751] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Waiting for the task: (returnval){ [ 896.924751] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52c242b7-4362-78cf-e67d-43e9eee3396b" [ 896.924751] env[59379]: _type = "Task" [ 896.924751] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 896.939305] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52c242b7-4362-78cf-e67d-43e9eee3396b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.952641] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Successfully created port: 3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 896.996723] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 896.996826] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 896.996958] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Deleting the datastore file [datastore1] a6ff207e-a925-46d1-9aaf-e06268d3c6f2 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 896.997233] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-48dff2fd-5bc6-445b-b1ba-53ab58059be1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.003467] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Waiting for the task: (returnval){ [ 897.003467] env[59379]: value = "task-559631" [ 897.003467] env[59379]: _type = "Task" [ 897.003467] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 897.012885] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Task: {'id': task-559631, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 897.442471] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 897.442471] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Creating directory with path [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.442471] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-039e02a1-9326-4d29-9775-059e354c5f94 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.458313] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Created directory with path [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.458313] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Fetch image to [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 897.458313] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 897.458313] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9dda5b1-a469-48e9-af83-6b41beae5866 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.467162] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-920ae1cc-515a-4835-9d25-f20e820a167d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.483013] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75caa7cc-5588-44e9-a8d3-24c69f6ec16b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.523832] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89c1c57e-5d6c-431b-bd11-eb655a068184 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
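
In parallel with the teardown, the ServerTagsTestJSON request (req-68be1b59) runs the cache-miss path of the image cache: it holds the lock named after the cached vmdk, creates a vmware_temp/ staging directory, streams the image file data to tmp-sparse.vmdk on datastore1 over HTTP, and then promotes it into devstack-image-cache_base (the CopyVirtualDisk_Task step seen earlier). A minimal sketch of that check-download-promote flow, assuming hypothetical download_image() and copy_virtual_disk() callables in place of Nova's real helpers:

    # Sketch only: download_image() and copy_virtual_disk() are hypothetical
    # callables standing in for the Glance download and CopyVirtualDisk_Task;
    # the threading.Lock stands in for the oslo.concurrency lock named after
    # the cached vmdk path.
    import os
    import shutil
    import tempfile
    from contextlib import contextmanager
    from pathlib import Path
    from threading import Lock

    _cache_locks: dict[str, Lock] = {}

    @contextmanager
    def image_lock(cache_path: str):
        with _cache_locks.setdefault(cache_path, Lock()):
            yield

    def fetch_image_if_missing(image_id: str, cache_dir: str,
                               download_image, copy_virtual_disk) -> str:
        """Return the cached disk path, downloading and promoting on a miss."""
        cached = os.path.join(cache_dir, image_id, image_id + '.vmdk')
        with image_lock(cached):
            if not os.path.exists(cached):          # cache miss
                os.makedirs(os.path.dirname(cached), exist_ok=True)
                with tempfile.TemporaryDirectory(prefix='vmware_temp-') as tmp:
                    sparse = os.path.join(tmp, 'tmp-sparse.vmdk')
                    download_image(image_id, sparse)     # stream image data
                    copy_virtual_disk(sparse, cached)    # promote into cache
        return cached

    if __name__ == '__main__':
        with tempfile.TemporaryDirectory() as cache:
            fake_download = lambda image_id, dst: Path(dst).write_bytes(b'vmdk')
            print(fetch_image_if_missing('a816e082-61f0-4ffa-a214-1bf6bd197f53',
                                         cache, fake_download, shutil.copyfile))

Serializing the whole check-download-promote sequence under one per-image lock is what lets req-68be1b59 take over the cache entry the moment req-75c8ffdd releases it, without two builds ever fetching the same image concurrently.
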
[ 897.534342] env[59379]: DEBUG oslo_vmware.api [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Task: {'id': task-559631, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085059} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 897.535230] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 897.535677] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 897.535786] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 897.537095] env[59379]: INFO nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Took 0.63 seconds to destroy the instance on the hypervisor. [ 897.537899] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3eed1b33-a5c2-4ba9-a9e8-b1e30f639960 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 897.540739] env[59379]: DEBUG nova.compute.claims [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 897.541024] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 897.541299] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 897.570225] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 897.581291] env[59379]: DEBUG oslo_concurrency.lockutils [None 
req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.039s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.581730] env[59379]: DEBUG nova.compute.utils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance a6ff207e-a925-46d1-9aaf-e06268d3c6f2 could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 897.583795] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 897.585083] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 897.585083] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 897.585083] env[59379]: DEBUG nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 897.585083] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 897.626769] env[59379]: DEBUG nova.network.neutron [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 897.639253] env[59379]: DEBUG oslo_vmware.rw_handles [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 897.706961] env[59379]: INFO nova.compute.manager [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Took 0.12 seconds to deallocate network for instance. [ 897.711933] env[59379]: DEBUG oslo_vmware.rw_handles [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 897.712167] env[59379]: DEBUG oslo_vmware.rw_handles [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 897.780404] env[59379]: DEBUG oslo_concurrency.lockutils [None req-75c8ffdd-c2cb-4058-970c-d0dc2a096f28 tempest-ImagesTestJSON-677661905 tempest-ImagesTestJSON-677661905-project-member] Lock "a6ff207e-a925-46d1-9aaf-e06268d3c6f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.202s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.801547] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Starting instance... 
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.810305] env[59379]: DEBUG nova.compute.manager [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Received event network-vif-plugged-3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 897.810611] env[59379]: DEBUG oslo_concurrency.lockutils [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] Acquiring lock "06d5ac6a-7734-46e3-80c5-d960821b7552-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 897.811703] env[59379]: DEBUG oslo_concurrency.lockutils [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] Lock "06d5ac6a-7734-46e3-80c5-d960821b7552-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 897.811703] env[59379]: DEBUG oslo_concurrency.lockutils [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] Lock "06d5ac6a-7734-46e3-80c5-d960821b7552-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.811703] env[59379]: DEBUG nova.compute.manager [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] No waiting events found dispatching network-vif-plugged-3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 897.811823] env[59379]: WARNING nova.compute.manager [req-ef67d5ea-7ac6-49b0-b4e5-4be9814b17a1 req-9167a1a0-dd8c-42c9-8d7e-1ae401e72c0a service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Received unexpected event network-vif-plugged-3aabbd9f-c7ef-4867-9ac8-dfea570218c7 for instance with vm_state building and task_state spawning. 
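The network-vif-plugged sequence above (acquire the per-instance "-events" lock, pop the waiting event, warn when none is registered) is waiter bookkeeping: a receiver of external events looks up whether anything is blocked waiting for that event. A self-contained sketch of the idea, using invented names rather than nova's internal API:

    # Self-contained sketch of the event-waiter bookkeeping suggested by
    # the entries above; class and function names here are invented for
    # the example and are not nova's own.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}             # (instance_uuid, event_name) -> Event
            self._lock = threading.Lock()  # plays the role of the "<uuid>-events" lock

        def prepare_event(self, instance_uuid, event_name):
            # Called by the side that will block, e.g. before plugging a VIF.
            waiter = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = waiter
            return waiter

        def pop_event(self, instance_uuid, event_name):
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    def dispatch_external_event(events, instance_uuid, event_name):
        waiter = events.pop_event(instance_uuid, event_name)
        if waiter is None:
            # Mirrors the WARNING above: the event arrived before anything
            # registered interest in it.
            print('Received unexpected event %s for instance %s'
                  % (event_name, instance_uuid))
        else:
            waiter.set()

In this trace the warning appears benign: per the entry itself the instance was still in vm_state building / task_state spawning, so the port-plug event arrived before any waiter had been registered.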
[ 897.863359] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 897.863618] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 897.865556] env[59379]: INFO nova.compute.claims [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 897.897791] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Successfully updated port: 3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 897.917029] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 897.917029] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquired lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 897.917029] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 897.994345] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 898.159952] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e323a0d-aed6-45b3-a8cd-1e68cd01fd52 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.168426] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d7b6437-ebde-455e-ac10-28e5bbdd13da {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.203545] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eddbad6-669a-4d60-b637-0b14052a14c5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.215348] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ab5528c-7500-4a51-a0a1-f1d0f3a444b4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.236589] env[59379]: DEBUG nova.compute.provider_tree [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.256652] env[59379]: DEBUG nova.scheduler.client.report [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.276784] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.278034] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Start building networks asynchronously for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 898.325306] env[59379]: DEBUG nova.compute.utils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 898.330019] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 898.330019] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 898.340391] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 898.431179] env[59379]: DEBUG nova.compute.manager [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Received event network-changed-e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 898.431317] env[59379]: DEBUG nova.compute.manager [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Refreshing instance network info cache due to event network-changed-e2da95d8-c28e-422e-a2bc-70407e33455f. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 898.431524] env[59379]: DEBUG oslo_concurrency.lockutils [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] Acquiring lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 898.431653] env[59379]: DEBUG oslo_concurrency.lockutils [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] Acquired lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 898.431802] env[59379]: DEBUG nova.network.neutron [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Refreshing network info cache for port e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 898.433246] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 898.439021] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 898.472890] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 898.473197] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 898.473320] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 898.473494] env[59379]: DEBUG nova.virt.hardware [None 
req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 898.473636] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 898.473792] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 898.474038] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 898.474254] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 898.475180] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 898.475180] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 898.475180] env[59379]: DEBUG nova.virt.hardware [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 898.480996] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b205f78a-257a-4958-bf86-943e9e793c12 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.492222] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b2da745-a4a0-47db-bc96-29a7be571926 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.593546] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Updating instance_info_cache 
with network_info: [{"id": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "address": "fa:16:3e:8a:df:ab", "network": {"id": "29dc264e-1d63-4e97-ad29-8ba15d5b66cc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1461808026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60a64c047df340008256e691a618d959", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3aabbd9f-c7", "ovs_interfaceid": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 898.602659] env[59379]: DEBUG nova.policy [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2e0c20ce66e045a5bfdffc27e037327e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd239a4f0ed5b48cf9cd9a334de6f189c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 898.610232] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Releasing lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 898.610486] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance network_info: |[{"id": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "address": "fa:16:3e:8a:df:ab", "network": {"id": "29dc264e-1d63-4e97-ad29-8ba15d5b66cc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1461808026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60a64c047df340008256e691a618d959", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, 
"bound_drivers": {"0": "nsxv3"}}, "devname": "tap3aabbd9f-c7", "ovs_interfaceid": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 898.610735] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8a:df:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4b43a78-f49b-4132-ab2e-6e28769a9498', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3aabbd9f-c7ef-4867-9ac8-dfea570218c7', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 898.619222] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Creating folder: Project (60a64c047df340008256e691a618d959). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 898.619872] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7891d331-d29b-4083-993e-4238e709a7fd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.633334] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Created folder: Project (60a64c047df340008256e691a618d959) in parent group-v140509. [ 898.636178] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Creating folder: Instances. Parent ref: group-v140576. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 898.636178] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ec659ada-60eb-41cf-93c6-35eca1947f33 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.647701] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Created folder: Instances in parent group-v140576. [ 898.647701] env[59379]: DEBUG oslo.service.loopingcall [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 898.647701] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 898.647701] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dc90d052-bcb5-449b-ac78-100579353008 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.673175] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 898.673175] env[59379]: value = "task-559634" [ 898.673175] env[59379]: _type = "Task" [ 898.673175] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 898.682233] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559634, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 899.031022] env[59379]: DEBUG nova.network.neutron [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updated VIF entry in instance network info cache for port e2da95d8-c28e-422e-a2bc-70407e33455f. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 899.031566] env[59379]: DEBUG nova.network.neutron [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating instance_info_cache with network_info: [{"id": "e2da95d8-c28e-422e-a2bc-70407e33455f", "address": "fa:16:3e:a0:6e:26", "network": {"id": "2f2053e4-da0c-4230-aad7-e0062af41d2e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-327622125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "396b40d8809545dc8eeb0fc355cfcc58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f2c424c9-6446-4b2a-af8c-4d9c29117c39", "external-id": "nsx-vlan-transportzone-437", "segmentation_id": 437, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2da95d8-c2", "ovs_interfaceid": "e2da95d8-c28e-422e-a2bc-70407e33455f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.052159] env[59379]: DEBUG oslo_concurrency.lockutils [req-3db0100b-9aa5-4df9-9f37-9a89369b2fa3 req-b35a42a7-02f7-4d81-aae9-771122515275 service nova] Releasing lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 899.183725] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559634, 'name': CreateVM_Task, 'duration_secs': 0.489709} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 899.183880] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 899.184592] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 899.184831] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 899.185417] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 899.185417] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-076bf51b-d1fb-4e9f-9fad-d59a740e5881 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 899.190169] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Waiting for the task: (returnval){ [ 899.190169] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52397ad9-ff9c-df8c-8728-3d7092383f6f" [ 899.190169] env[59379]: _type = "Task" [ 899.190169] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 899.198909] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52397ad9-ff9c-df8c-8728-3d7092383f6f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 899.428709] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 899.521995] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Successfully created port: 592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 899.703053] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 899.703334] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 899.703655] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 899.982989] env[59379]: DEBUG nova.compute.manager [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Received event network-changed-3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 899.983237] env[59379]: DEBUG nova.compute.manager [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Refreshing instance network info cache due to event network-changed-3aabbd9f-c7ef-4867-9ac8-dfea570218c7. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 899.983446] env[59379]: DEBUG oslo_concurrency.lockutils [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] Acquiring lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 899.986177] env[59379]: DEBUG oslo_concurrency.lockutils [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] Acquired lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 899.986417] env[59379]: DEBUG nova.network.neutron [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Refreshing network info cache for port 3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 900.259502] env[59379]: DEBUG nova.network.neutron [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Updated VIF entry in instance network info cache for port 3aabbd9f-c7ef-4867-9ac8-dfea570218c7. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 900.259866] env[59379]: DEBUG nova.network.neutron [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Updating instance_info_cache with network_info: [{"id": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "address": "fa:16:3e:8a:df:ab", "network": {"id": "29dc264e-1d63-4e97-ad29-8ba15d5b66cc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1461808026-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60a64c047df340008256e691a618d959", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3aabbd9f-c7", "ovs_interfaceid": "3aabbd9f-c7ef-4867-9ac8-dfea570218c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.270110] env[59379]: DEBUG oslo_concurrency.lockutils [req-790af949-d021-41f8-bc7d-a89656790fe4 req-62b3a690-6e8e-4418-9895-84509f096f49 service nova] Releasing lock "refresh_cache-06d5ac6a-7734-46e3-80c5-d960821b7552" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 900.433751] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 900.448108] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Successfully updated port: 592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 900.459261] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 900.459398] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 900.459537] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 900.505215] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 900.781505] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updating instance_info_cache with network_info: [{"id": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "address": "fa:16:3e:8c:d8:dc", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592cf8cf-5d", "ovs_interfaceid": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.795259] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 900.795543] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance network_info: |[{"id": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "address": "fa:16:3e:8c:d8:dc", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592cf8cf-5d", "ovs_interfaceid": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 900.796140] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None 
req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8c:d8:dc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '778b9a40-d603-4765-ac88-bd6d42c457a2', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '592cf8cf-5dcb-4a78-bf0a-f4a9a8272068', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 900.804914] env[59379]: DEBUG oslo.service.loopingcall [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 900.805346] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 900.805554] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8a85a43a-07c7-4cd1-acd4-ebca90b949ce {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.827480] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 900.827480] env[59379]: value = "task-559635" [ 900.827480] env[59379]: _type = "Task" [ 900.827480] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.835696] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559635, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.338187] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559635, 'name': CreateVM_Task, 'duration_secs': 0.381603} completed successfully. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 901.338445] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 901.338973] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.339135] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 901.339458] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 901.339707] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bf0d8c74-dc72-4280-9c93-8c7731c6e28b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.344205] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){ [ 901.344205] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5226025a-ab68-e378-3790-0124fc2f03a2" [ 901.344205] env[59379]: _type = "Task" [ 901.344205] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 901.351861] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]5226025a-ab68-e378-3790-0124fc2f03a2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 901.428533] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 901.855034] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 901.855285] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 901.855493] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 902.035258] env[59379]: DEBUG nova.compute.manager [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Received event network-vif-plugged-592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 902.035468] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Acquiring lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 902.035662] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 902.035815] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 902.035973] env[59379]: DEBUG nova.compute.manager [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] No waiting events found dispatching network-vif-plugged-592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 
902.036193] env[59379]: WARNING nova.compute.manager [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Received unexpected event network-vif-plugged-592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 for instance with vm_state building and task_state spawning. [ 902.036318] env[59379]: DEBUG nova.compute.manager [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Received event network-changed-592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 902.036482] env[59379]: DEBUG nova.compute.manager [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Refreshing instance network info cache due to event network-changed-592cf8cf-5dcb-4a78-bf0a-f4a9a8272068. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 902.036615] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Acquiring lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 902.036735] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Acquired lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 902.036916] env[59379]: DEBUG nova.network.neutron [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Refreshing network info cache for port 592cf8cf-5dcb-4a78-bf0a-f4a9a8272068 {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 902.795699] env[59379]: DEBUG nova.network.neutron [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updated VIF entry in instance network info cache for port 592cf8cf-5dcb-4a78-bf0a-f4a9a8272068. 
[ 902.795699] env[59379]: DEBUG nova.network.neutron [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updating instance_info_cache with network_info: [{"id": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "address": "fa:16:3e:8c:d8:dc", "network": {"id": "0f9e232a-08ce-4a77-988f-3fc12cf9066f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.72", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "e8722a37ef8f4279abfe709d29d7d3ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "778b9a40-d603-4765-ac88-bd6d42c457a2", "external-id": "nsx-vlan-transportzone-114", "segmentation_id": 114, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap592cf8cf-5d", "ovs_interfaceid": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 902.807584] env[59379]: DEBUG oslo_concurrency.lockutils [req-037eed13-e36e-49f3-9430-f3deda4c8da0 req-5fd5d2ed-33f9-4ecf-a68e-4b8a60d3ad46 service nova] Releasing lock "refresh_cache-64bc3ac9-57b4-4f50-97fa-ba684c1595b4" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 903.435196] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 903.435196] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 903.435196] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 903.453064] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.453064] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.453064] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
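Editor's note: the network_info blob above is the document Nova caches per instance, one entry per VIF: MAC, network, subnets, fixed and floating IPs, and the OVS binding details. A small illustrative walker over a trimmed copy of that structure, with field names taken verbatim from the log (plain data handling, not Nova code):

    # Trimmed copy of the cached entry above.
    network_info = [{
        "id": "592cf8cf-5dcb-4a78-bf0a-f4a9a8272068",
        "address": "fa:16:3e:8c:d8:dc",
        "network": {
            "label": "shared",
            "subnets": [{
                "cidr": "192.168.233.0/24",
                "ips": [{"address": "192.168.233.72", "type": "fixed",
                         "floating_ips": []}],
            }],
        },
        "type": "ovs",
        "devname": "tap592cf8cf-5d",
    }]

    def addresses(nw_info):
        """Yield (address, type) pairs for every fixed and floating IP."""
        for vif in nw_info:
            for subnet in vif["network"]["subnets"]:
                for ip in subnet["ips"]:
                    yield ip["address"], ip["type"]
                    for fip in ip.get("floating_ips", []):
                        yield fip["address"], fip["type"]

    print(dict(addresses(network_info)))   # {'192.168.233.72': 'fixed'}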
[ 903.453496] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.453763] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.454024] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.454303] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}}
[ 903.481449] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 903.481793] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquired lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 903.482104] env[59379]: DEBUG nova.network.neutron [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Forcefully refreshing network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2003}}
[ 903.482423] env[59379]: DEBUG nova.objects.instance [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lazy-loading 'info_cache' on Instance uuid 13aee471-4813-4376-a7bf-70f266d9a399 {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}}
[ 903.765099] env[59379]: DEBUG nova.network.neutron [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating instance_info_cache with network_info: [{"id": "e2da95d8-c28e-422e-a2bc-70407e33455f", "address": "fa:16:3e:a0:6e:26", "network": {"id": "2f2053e4-da0c-4230-aad7-e0062af41d2e", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-327622125-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "396b40d8809545dc8eeb0fc355cfcc58", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f2c424c9-6446-4b2a-af8c-4d9c29117c39", "external-id": "nsx-vlan-transportzone-437", "segmentation_id": 437, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2da95d8-c2", "ovs_interfaceid": "e2da95d8-c28e-422e-a2bc-70407e33455f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 903.774229] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Releasing lock "refresh_cache-13aee471-4813-4376-a7bf-70f266d9a399" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 903.775836] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updated the network info_cache for instance {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9879}}
[ 903.778095] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 903.778095] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 903.778095] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 903.778095] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 903.791420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 903.791420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 903.791420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 903.791420] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 903.791988] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-667bb198-01d5-4c70-bdb1-b5aac2cca8f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 903.803610] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd9c2b9-f27f-4ec7-b1c8-95c81378cb27 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 903.822138] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-204c47e5-5b04-40b2-977d-24e71f5d458b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 903.830777] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44256448-1ada-4ad8-95ec-24857e1271e7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 903.862233] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181737MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 903.862391] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 903.862583] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 903.922874] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.922874] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923064] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923118] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923202] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923317] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 13aee471-4813-4376-a7bf-70f266d9a399 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923428] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 06d5ac6a-7734-46e3-80c5-d960821b7552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923538] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 903.923720] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 903.923851] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 904.040749] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4624c371-fcc3-48bf-9651-516ffd1c68bf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 904.049633] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a3b4555-ed9d-40f0-a7f1-95f89f8c63e5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 904.083272] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d98845ff-c53b-45ca-b4d1-86360777ff30 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 904.094028] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be95032f-af05-4021-bde9-7ac5488c9e32 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 904.118066] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 904.130432] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 904.147757] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 904.147953] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 904.804883] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
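Editor's note: the inventory report above is the payload the resource tracker would send to the placement service; since the freshly computed inventory equals the cached one, the report client skips the update. A minimal illustrative sketch of that compare-before-PUT pattern, with the numbers taken from the log (the payload builder below is a stand-in, not Nova's report client):

    def build_inventory(total_vcpus, ram_mb, disk_gb):
        return {
            "VCPU": {"total": total_vcpus, "reserved": 0, "min_unit": 1,
                     "max_unit": 16, "step_size": 1, "allocation_ratio": 4.0},
            "MEMORY_MB": {"total": ram_mb, "reserved": 512, "min_unit": 1,
                          "max_unit": 65530, "step_size": 1,
                          "allocation_ratio": 1.0},
            "DISK_GB": {"total": disk_gb, "reserved": 0, "min_unit": 1,
                        "max_unit": 101, "step_size": 1,
                        "allocation_ratio": 1.0},
        }

    cached = build_inventory(48, 196590, 400)
    fresh = build_inventory(48, 196590, 400)
    if fresh == cached:
        print("Inventory has not changed for provider; skipping update")
    else:
        print("PUT /resource_providers/<uuid>/inventories")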
[ 904.805178] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 908.695515] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "2e622c9d-369c-4c36-a477-3237bea4cf7c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 908.695773] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "2e622c9d-369c-4c36-a477-3237bea4cf7c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 914.294913] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "13aee471-4813-4376-a7bf-70f266d9a399" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 914.295207] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 914.295356] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "13aee471-4813-4376-a7bf-70f266d9a399-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 914.295524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 914.295678] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 914.298203] env[59379]: INFO nova.compute.manager [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Terminating instance
[ 914.300036] env[59379]: DEBUG nova.compute.manager [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 914.300249] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Powering off the VM {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}}
[ 914.300793] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-f6b65d25-be20-4a38-b9a6-f58245bd7939 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.308446] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){
[ 914.308446] env[59379]: value = "task-559636"
[ 914.308446] env[59379]: _type = "Task"
[ 914.308446] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 914.316718] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559636, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 914.818125] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559636, 'name': PowerOffVM_Task, 'duration_secs': 0.183065} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 914.818433] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Powered off the VM {{(pid=59379) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}}
[ 914.818620] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Volume detach. Driver type: vmdk {{(pid=59379) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}}
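Editor's note: the PowerOffVM_Task wait above is the standard vCenter pattern: every mutating call returns a Task object that the client polls until it reaches a terminal state, logging progress along the way. A minimal illustrative sketch of that poll loop in plain Python (the `poll` callable stands in for a PropertyCollector read of the Task; this is not the oslo.vmware implementation):

    import time

    def wait_for_task(poll, interval=0.5):
        """Poll a vCenter-style task dict until success or error."""
        start = time.monotonic()
        while True:
            info = poll()
            if info["state"] == "success":
                info["duration_secs"] = round(time.monotonic() - start, 6)
                return info
            if info["state"] == "error":
                raise RuntimeError(info.get("error", "task failed"))
            print(f"Task {info['id']} progress is {info['progress']}%.")
            time.sleep(interval)

    states = iter([
        {"id": "task-559636", "state": "running", "progress": 0},
        {"id": "task-559636", "state": "success", "progress": 100},
    ])
    print(wait_for_task(lambda: next(states), interval=0))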
[ 914.818797] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140553', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'name': 'volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13aee471-4813-4376-a7bf-70f266d9a399', 'attached_at': '', 'detached_at': '', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'serial': '9c19a3a5-ade9-4b1f-afd2-6e797690b152'} {{(pid=59379) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}}
[ 914.819530] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be9d6f8-1b19-4939-98bf-de8bac366ac6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.839026] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc64a738-566c-4f36-957c-a570280a77db {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.844490] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b68e024d-cb1e-4a6f-9f6a-c2db2b4b73eb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.861829] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1601ccd-6f7f-4c48-b2e0-8ac12fd95e7d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.876629] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] The volume has not been displaced from its original location: [datastore1] volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152/volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152.vmdk. No consolidation needed. {{(pid=59379) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}}
[ 914.881873] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Reconfiguring VM instance instance-0000001c to detach disk 2000 {{(pid=59379) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}}
[ 914.882133] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-5785fec6-0ffe-486f-bc25-8ae9ddf3a2d6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 914.899516] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){
[ 914.899516] env[59379]: value = "task-559637"
[ 914.899516] env[59379]: _type = "Task"
[ 914.899516] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 914.906868] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559637, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 915.409763] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559637, 'name': ReconfigVM_Task, 'duration_secs': 0.153612} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 915.410154] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Reconfigured VM instance instance-0000001c to detach disk 2000 {{(pid=59379) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}}
[ 915.414611] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-1f28d6c1-4931-4516-b4e6-ab1eb6cfbb31 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 915.429510] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){
[ 915.429510] env[59379]: value = "task-559638"
[ 915.429510] env[59379]: _type = "Task"
[ 915.429510] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 915.437015] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559638, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 915.939582] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559638, 'name': ReconfigVM_Task, 'duration_secs': 0.277707} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 915.939870] env[59379]: DEBUG nova.virt.vmwareapi.volumeops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-140553', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'name': 'volume-9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '13aee471-4813-4376-a7bf-70f266d9a399', 'attached_at': '', 'detached_at': '', 'volume_id': '9c19a3a5-ade9-4b1f-afd2-6e797690b152', 'serial': '9c19a3a5-ade9-4b1f-afd2-6e797690b152'} {{(pid=59379) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}}
[ 915.940184] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 915.940916] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68ba1f45-939a-4c4b-aeb2-4fbeb7ac6460 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 915.947279] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 915.947423] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2b63f190-13df-4dc5-a972-04a581eb10d3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 916.007733] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 916.007983] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 916.008180] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Deleting the datastore file [datastore1] 13aee471-4813-4376-a7bf-70f266d9a399 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 916.008426] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db6cc8f6-6265-4ac5-88ef-9c40bc173d78 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
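Editor's note: the entries above walk through the teardown of instance 13aee471 step by step: power off, one ReconfigVM_Task per disk being detached, unregister, then deletion of the datastore directory. An illustrative outline of that ordering (the session object is a printing stand-in, not the real oslo.vmware session; each call maps to one vCenter task in the log):

    def destroy_instance(session, vm_ref, datastore_path, disks):
        session.power_off(vm_ref)                        # PowerOffVM_Task
        for disk in disks:                               # one ReconfigVM_Task
            session.reconfig_detach_disk(vm_ref, disk)   # per detached disk
        session.unregister(vm_ref)                       # UnregisterVM
        session.delete_file(datastore_path)              # DeleteDatastoreFile_Task

    class FakeSession:
        def __getattr__(self, name):                     # print each step in order
            return lambda *args: print(name, *args)

    destroy_instance(FakeSession(), "vm-ref",
                     "[datastore1] 13aee471-4813-4376-a7bf-70f266d9a399",
                     ["disk 2000"])

The volume's own VMDK stays behind on the datastore ("The volume has not been displaced from its original location"): the reconfigure only removes the disk from the VM, since the backing file belongs to Cinder.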
[ 916.014461] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for the task: (returnval){
[ 916.014461] env[59379]: value = "task-559640"
[ 916.014461] env[59379]: _type = "Task"
[ 916.014461] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 916.021966] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559640, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 916.525625] env[59379]: DEBUG oslo_vmware.api [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Task: {'id': task-559640, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074872} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 916.525935] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 916.526029] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 916.526199] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 916.526359] env[59379]: INFO nova.compute.manager [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Took 2.23 seconds to destroy the instance on the hypervisor.
[ 916.526580] env[59379]: DEBUG oslo.service.loopingcall [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 916.526745] env[59379]: DEBUG nova.compute.manager [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 916.526835] env[59379]: DEBUG nova.network.neutron [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 917.311340] env[59379]: DEBUG nova.network.neutron [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 917.346926] env[59379]: INFO nova.compute.manager [-] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Took 0.82 seconds to deallocate network for instance.
[ 917.353894] env[59379]: DEBUG nova.compute.manager [req-dffe0f1b-6bcb-4ff4-8858-1288c25c3403 req-801d80ca-2614-424c-a075-91c3972a8448 service nova] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Received event network-vif-deleted-e2da95d8-c28e-422e-a2bc-70407e33455f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 917.403342] env[59379]: INFO nova.compute.manager [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Took 0.06 seconds to detach 1 volumes for instance.
[ 917.405926] env[59379]: DEBUG nova.compute.manager [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Deleting volume: 9c19a3a5-ade9-4b1f-afd2-6e797690b152 {{(pid=59379) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}}
[ 917.495255] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 917.496072] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 917.496072] env[59379]: DEBUG nova.objects.instance [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lazy-loading 'resources' on Instance uuid 13aee471-4813-4376-a7bf-70f266d9a399 {{(pid=59379) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}}
[ 917.630899] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ebad04b-e34d-424a-a57b-09bdbd941426 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 917.638028] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46492615-9298-42df-aa9e-0589dbcf1dd1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
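Editor's note: the looping call above wraps the Neutron deallocation in a retry loop (_deallocate_network_with_retries) so that a transient Neutron failure does not leak ports; once it succeeds, the instance's network info cache is emptied to []. A minimal illustrative sketch of that retry wrapper in plain Python (names and the simulated failure are stand-ins, not Nova's oslo.service looping call):

    import time

    def deallocate_network_with_retries(deallocate, attempts=3, delay=0.1):
        """Try the network deallocation a few times before giving up."""
        for attempt in range(1, attempts + 1):
            try:
                deallocate()
                return
            except Exception as exc:            # broad on purpose: retried
                print(f"attempt {attempt} failed: {exc}")
                time.sleep(delay)
        raise RuntimeError("could not deallocate network")

    calls = {"n": 0}

    def flaky_deallocate():
        calls["n"] += 1
        if calls["n"] < 2:
            raise ConnectionError("neutron unreachable")   # simulated failure
        print("deallocate_for_instance() done; cache -> []")

    deallocate_network_with_retries(flaky_deallocate)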
[ 917.667811] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52619507-9115-45f1-8784-fcbb2a1b882d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 917.677631] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c999cce-95bd-4270-bb80-a554452014f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 917.690574] env[59379]: DEBUG nova.compute.provider_tree [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 917.699027] env[59379]: DEBUG nova.scheduler.client.report [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 917.714199] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.219s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 917.731214] env[59379]: INFO nova.scheduler.client.report [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Deleted allocations for instance 13aee471-4813-4376-a7bf-70f266d9a399
[ 917.773077] env[59379]: DEBUG oslo_concurrency.lockutils [None req-d030dd6b-a9e3-4daa-9856-6012c6d3ba5e tempest-ServersTestBootFromVolume-1547622611 tempest-ServersTestBootFromVolume-1547622611-project-member] Lock "13aee471-4813-4376-a7bf-70f266d9a399" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.478s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 929.666031] env[59379]: WARNING oslo_vmware.rw_handles [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles     response.begin()
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 929.666031] env[59379]: ERROR oslo_vmware.rw_handles
[ 929.666031] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 929.666761] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 929.666808] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Copying Virtual Disk [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/a0d0e2a9-9fba-4293-ace0-a43db3d5731b/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 929.667211] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a1da42bb-99be-4c8e-9c90-07052e9f6a90 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 929.675385] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for the task: (returnval){
[ 929.675385] env[59379]: value = "task-559642"
[ 929.675385] env[59379]: _type = "Task"
[ 929.675385] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 929.683766] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Task: {'id': task-559642, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 930.187106] env[59379]: DEBUG oslo_vmware.exceptions [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 930.187106] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 930.187384] env[59379]: ERROR nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 930.187384] env[59379]: Faults: ['InvalidArgument']
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Traceback (most recent call last):
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     yield resources
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     self.driver.spawn(context, instance, image_meta,
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     self._fetch_image_if_missing(context, vi)
[ 930.187384] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     image_cache(vi, tmp_image_ds_loc)
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     vm_util.copy_virtual_disk(
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     session._wait_for_task(vmdk_copy_task)
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     return self.wait_for_task(task_ref)
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     return evt.wait()
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     result = hub.switch()
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 930.187712] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     return self.greenlet.switch()
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     self.f(*self.args, **self.kw)
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]     raise exceptions.translate_fault(task_info.error)
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Faults: ['InvalidArgument']
[ 930.188104] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]
[ 930.188104] env[59379]: INFO nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Terminating instance
[ 930.189167] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
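Editor's note: "Fault InvalidArgument not matched" above is oslo.vmware's fault translation at work: known vCenter fault names map to specific exception classes, and anything unmapped falls back to a generic VimFaultException carrying the fault list, which is what the spawn failure then raises. A minimal illustrative sketch of that lookup-with-fallback pattern (the class registry below is a stand-in, not the oslo.vmware module):

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    class FileNotFoundException(Exception):          # example of a mapped fault
        pass

    _FAULT_CLASSES = {"FileNotFound": FileNotFoundException}

    def translate_fault(fault_name, message):
        cls = _FAULT_CLASSES.get(fault_name)
        if cls is None:
            print(f"Fault {fault_name} not matched.")   # the DEBUG line above
            return VimFaultException([fault_name], message)
        return cls(message)

    exc = translate_fault("InvalidArgument",
                          "A specified parameter was not correct: fileType")
    print(type(exc).__name__, exc.fault_list)        # VimFaultException ['InvalidArgument']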
[ 930.189372] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 930.189613] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-68e54a74-3993-4282-a8f6-3dbb92d9a584 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.191936] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 930.192200] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 930.192879] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-523d32cd-7f61-4336-88d8-ad18f3fffb42 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.199243] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 930.199450] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3df36212-8520-4ca0-af55-59a2d34963da {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.201432] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 930.201594] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 930.202566] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71da910a-9ab7-4b55-94aa-9f57674ad2a5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.207425] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for the task: (returnval){
[ 930.207425] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52950ddf-7e1f-2ac5-152c-66a071b4db47"
[ 930.207425] env[59379]: _type = "Task"
[ 930.207425] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 930.217707] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52950ddf-7e1f-2ac5-152c-66a071b4db47, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 930.271272] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 930.271469] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 930.271643] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Deleting the datastore file [datastore2] 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 930.271883] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-23a6996d-36f0-46c2-b46b-e1d6b8ea32a0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 930.278307] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for the task: (returnval){
[ 930.278307] env[59379]: value = "task-559644"
[ 930.278307] env[59379]: _type = "Task"
[ 930.278307] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 930.285864] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Task: {'id': task-559644, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 930.717978] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 930.718402] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Creating directory with path [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 930.718592] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-72cfae51-1e34-4f32-be2f-41dfcf7fdd44 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.731986] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Created directory with path [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 930.732265] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Fetch image to [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 930.732496] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 930.733286] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cc7e6cc-9c49-40c0-8533-8ae26f9b1ccb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.740178] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69639820-7a62-40f0-997c-50d3ae411d32 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.749143] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acfb2504-b5bd-4193-9d02-1f03a748f913 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.789588] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f389392-04cc-4300-9b7b-b4b0c1d07e74 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.796887] env[59379]: DEBUG oslo_vmware.api [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Task: {'id': task-559644, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072503} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 930.798476] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 930.798665] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 930.798829] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 930.798998] env[59379]: INFO nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Took 0.61 seconds to destroy the instance on the hypervisor. 
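
task-559644 above follows the asynchronous-task contract visible throughout this log: a *_Task method returns a task reference immediately, the client polls it ("progress is 0%" ... "completed successfully"), and a failed task's fault is translated into a Python exception such as the VimFaultException in the traceback above. A sketch of the wait, reusing the placeholder session and dc_ref from the previous snippet:

    from oslo_vmware import exceptions as vexc

    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              session.vim.service_content.fileManager,
                              name='[datastore2] 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd',
                              datacenter=dc_ref)
    try:
        # Polls the task every task_poll_interval seconds and raises if
        # it finishes in an error state (the _poll_task frames above).
        session.wait_for_task(task)
    except vexc.VimFaultException as e:
        print('task failed: %s; faults: %s' % (e, e.fault_list))
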
[ 930.800844] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-75e7409d-fa9d-4798-bb09-d4f56c0eb7c7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.802833] env[59379]: DEBUG nova.compute.claims [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 930.803008] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 930.803223] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 930.825307] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 930.879521] env[59379]: DEBUG oslo_vmware.rw_handles [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 930.942416] env[59379]: DEBUG oslo_vmware.rw_handles [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 930.942570] env[59379]: DEBUG oslo_vmware.rw_handles [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 931.009771] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff3be486-b7fe-4599-abf6-96341dd1b89e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.018160] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d26c25ce-3b3d-4449-bf1d-361f57a60739 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.048234] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c90c9f6a-d234-4e89-ac0c-3578885c73af {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.055405] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3373585f-97a2-45a1-b4d1-5e9d00998cf3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.068940] env[59379]: DEBUG nova.compute.provider_tree [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 931.077189] env[59379]: DEBUG nova.scheduler.client.report [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 931.091247] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.288s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 931.091782] env[59379]: ERROR nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 931.091782] env[59379]: Faults: ['InvalidArgument']
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Traceback (most recent call last):
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] self.driver.spawn(context, instance, image_meta,
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] self._fetch_image_if_missing(context, vi)
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] image_cache(vi, tmp_image_ds_loc)
[ 931.091782] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] vm_util.copy_virtual_disk(
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] session._wait_for_task(vmdk_copy_task)
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] return self.wait_for_task(task_ref)
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] return evt.wait()
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] result = hub.switch()
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] return self.greenlet.switch()
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 931.092277] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] self.f(*self.args, **self.kw)
[ 931.092762] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 931.092762] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] raise exceptions.translate_fault(task_info.error)
[ 931.092762] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 931.092762] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Faults: ['InvalidArgument']
[ 931.092762] env[59379]: ERROR nova.compute.manager [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd]
[ 931.092762] env[59379]: DEBUG nova.compute.utils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 931.093828] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Build of instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd was re-scheduled: A specified parameter was not correct: fileType
[ 931.093828] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 931.094200] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 931.094365] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 931.094526] env[59379]: DEBUG nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 931.094678] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 931.358987] env[59379]: DEBUG nova.network.neutron [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 931.371047] env[59379]: INFO nova.compute.manager [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Took 0.28 seconds to deallocate network for instance. [ 931.454733] env[59379]: INFO nova.scheduler.client.report [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Deleted allocations for instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd [ 931.473200] env[59379]: DEBUG oslo_concurrency.lockutils [None req-437bacfc-60af-41f1-9793-2876ddd7a707 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 340.250s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.474348] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 337.627s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.474536] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] During sync_power_state the instance has a pending task (spawning). Skip. 
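
The repeated "Acquiring lock … / acquired … :: waited / "released" … :: held" triples in these lines are oslo.concurrency reporting its timings: lockutils.synchronized wraps a callable (the inner frames at lockutils.py:404/409/423), while lockutils.lock is the context-manager form (lockutils.py:312/315/333). A minimal reproduction of the pattern, with illustrative names rather than Nova's actual call sites:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        # Runs with the named in-process lock held; the decorator's
        # wrapper logs the "acquired ... waited" / "released ... held"
        # DEBUG lines with the wait and hold durations.
        pass

    # Context-manager form, as used for the per-instance event lock:
    with lockutils.lock('91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events'):
        pass

    abort_instance_claim()
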
[ 931.474702] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.475452] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 138.663s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.475696] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Acquiring lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.475894] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.476065] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.477841] env[59379]: INFO nova.compute.manager [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Terminating instance [ 931.479504] env[59379]: DEBUG nova.compute.manager [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 931.479684] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 931.479929] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-410cc23c-e119-44df-9c4f-43dab2c0cddf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.489936] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94ca7f5c-ae59-43e6-bc2a-437b14315acb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.500403] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 931.519485] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd could not be found. [ 931.519662] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 931.519828] env[59379]: INFO nova.compute.manager [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 931.520097] env[59379]: DEBUG oslo.service.loopingcall [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
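
The "Waiting for function … _deallocate_network_with_retries to return" line comes from oslo.service's looping-call machinery: the deallocation step is wrapped so it can be re-invoked on a timer until it signals completion. A generic sketch of the mechanism (the interval and retry count here are illustrative, not Nova's actual configuration):

    from oslo_service import loopingcall

    attempts = {'count': 0}

    def _deallocate_with_retries():
        attempts['count'] += 1
        if attempts['count'] >= 3:  # pretend the third attempt succeeds
            # Stops the loop and hands retvalue to .wait() below.
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    result = timer.start(interval=2).wait()  # True once the loop is done
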
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 931.520316] env[59379]: DEBUG nova.compute.manager [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 931.520405] env[59379]: DEBUG nova.network.neutron [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 931.542742] env[59379]: DEBUG nova.network.neutron [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 931.563119] env[59379]: INFO nova.compute.manager [-] [instance: 91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd] Took 0.04 seconds to deallocate network for instance. [ 931.578628] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.578854] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.580328] env[59379]: INFO nova.compute.claims [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 931.644557] env[59379]: DEBUG oslo_concurrency.lockutils [None req-10bb9d9e-43e6-4b36-8c23-46a06dcf5aa2 tempest-FloatingIPsAssociationNegativeTestJSON-180034896 tempest-FloatingIPsAssociationNegativeTestJSON-180034896-project-member] Lock "91dfb2c2-4cec-40ef-a6cf-ccb65fedaefd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.707710] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad3f4f1e-6ad9-46e6-89ed-2d3af7570ac8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.715549] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be15d3d4-6e04-4dd6-819b-9d9c30acfac5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.747164] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d4c9f64-789e-40fb-b1ac-80ea28d8499e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.753966] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74a4afbf-f276-4266-9f8e-095617333de6 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.766945] env[59379]: DEBUG nova.compute.provider_tree [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 931.775035] env[59379]: DEBUG nova.scheduler.client.report [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 931.789614] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.790160] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 931.823182] env[59379]: DEBUG nova.compute.utils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 931.824600] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 931.824766] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 931.833330] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Start building block device mappings for instance. 
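
"Inventory has not changed for provider …" above means the report client compared the inventory it would send with what Placement already holds and skipped the update. In essence it is a dict comparison over the resource classes shown in that line:

    # The provider inventory exactly as logged for
    # 693f1d2b-e627-44fb-bcd5-714cccac894b.
    reported = {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1,
                      'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1,
                    'max_unit': 101, 'step_size': 1,
                    'allocation_ratio': 1.0},
    }

    def inventory_changed(current, new):
        # Simplified stand-in for the report client's check: any field
        # difference in any resource class triggers a PUT to Placement.
        return current != new

    print(inventory_changed(reported, dict(reported)))  # -> False
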
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 931.893340] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 931.898129] env[59379]: DEBUG nova.policy [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c5725815ee2c4c67bc5cdc3384140761', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cc08c67065e0450e87f01130f1571b3f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 931.915273] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 931.915497] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 931.915645] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 931.915816] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 931.915952] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 931.916103] env[59379]: DEBUG nova.virt.hardware [None 
req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 931.916297] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 931.916446] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 931.916601] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 931.916757] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 931.916921] env[59379]: DEBUG nova.virt.hardware [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 931.917741] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e47c40-6d2e-4682-a05c-425177f1f5c4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.925812] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50343c77-6fc7-441a-a14e-be43dface138 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.648756] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Successfully created port: 780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 933.752344] env[59379]: DEBUG nova.compute.manager [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Received event network-vif-plugged-780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 933.752612] env[59379]: DEBUG oslo_concurrency.lockutils [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] Acquiring lock 
"2e622c9d-369c-4c36-a477-3237bea4cf7c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.752741] env[59379]: DEBUG oslo_concurrency.lockutils [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] Lock "2e622c9d-369c-4c36-a477-3237bea4cf7c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.752894] env[59379]: DEBUG oslo_concurrency.lockutils [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] Lock "2e622c9d-369c-4c36-a477-3237bea4cf7c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 933.753091] env[59379]: DEBUG nova.compute.manager [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] No waiting events found dispatching network-vif-plugged-780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 933.753437] env[59379]: WARNING nova.compute.manager [req-d42c579a-61cb-432b-b0cf-c9c8d87eb96f req-b6433423-f284-4632-9d7b-7faf8226f0b3 service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Received unexpected event network-vif-plugged-780fb21b-08f6-490a-9550-88ae379b00bc for instance with vm_state building and task_state spawning. [ 933.860512] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Successfully updated port: 780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 933.876231] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 933.876372] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquired lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 933.876557] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 933.911131] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 934.075369] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Updating instance_info_cache with network_info: [{"id": "780fb21b-08f6-490a-9550-88ae379b00bc", "address": "fa:16:3e:f6:9a:41", "network": {"id": "f2381075-9072-4f6f-8a2f-20dae5516eae", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1918712621-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cc08c67065e0450e87f01130f1571b3f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap780fb21b-08", "ovs_interfaceid": "780fb21b-08f6-490a-9550-88ae379b00bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 934.091340] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Releasing lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 934.091691] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance network_info: |[{"id": "780fb21b-08f6-490a-9550-88ae379b00bc", "address": "fa:16:3e:f6:9a:41", "network": {"id": "f2381075-9072-4f6f-8a2f-20dae5516eae", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1918712621-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cc08c67065e0450e87f01130f1571b3f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap780fb21b-08", "ovs_interfaceid": "780fb21b-08f6-490a-9550-88ae379b00bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 934.091991] env[59379]: DEBUG 
nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f6:9a:41', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a0734cc4-5718-45e2-9f98-0ded96880bef', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '780fb21b-08f6-490a-9550-88ae379b00bc', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 934.099329] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Creating folder: Project (cc08c67065e0450e87f01130f1571b3f). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 934.099856] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0614c12-289e-475f-a114-a601ee1e6f2e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.114532] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Created folder: Project (cc08c67065e0450e87f01130f1571b3f) in parent group-v140509. [ 934.114717] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Creating folder: Instances. Parent ref: group-v140580. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 934.114932] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e9dc998b-0a60-45b6-91f4-4230a275f20d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.126310] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Created folder: Instances in parent group-v140580. [ 934.126549] env[59379]: DEBUG oslo.service.loopingcall [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 934.126723] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 934.126908] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1bf053b8-c5d9-4cb0-9e56-0e4de9b87d32 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.146850] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 934.146850] env[59379]: value = "task-559647" [ 934.146850] env[59379]: _type = "Task" [ 934.146850] env[59379]: } to complete. 
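
The two Folder.CreateFolder invocations above lay out the "Project (…)" and "Instances" hierarchy under parent group-v140509 before the VM is registered. A sketch of such an idempotent folder step via the session proxy (session and parent_ref are placeholders carried over from the earlier snippets; catching DuplicateName is one plausible way to tolerate an existing folder):

    from oslo_vmware import exceptions as vexc

    def create_folder(session, parent_ref, name):
        # Folder.CreateFolder is synchronous and returns the new moref.
        try:
            return session.invoke_api(session.vim, 'CreateFolder',
                                      parent_ref, name=name)
        except vexc.DuplicateName:
            # Folder already exists; a caller would look it up instead.
            return None

    project = create_folder(session, parent_ref,
                            'Project (cc08c67065e0450e87f01130f1571b3f)')
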
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 934.155060] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559647, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 934.656388] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559647, 'name': CreateVM_Task, 'duration_secs': 0.323019} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 934.656536] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 934.657219] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 934.657373] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 934.657701] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 934.657961] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df295697-7644-471a-8bfa-6a16612a27f9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 934.662409] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Waiting for the task: (returnval){ [ 934.662409] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52f78a04-a351-60a5-0ccf-81c5cd0df49b" [ 934.662409] env[59379]: _type = "Task" [ 934.662409] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 934.669458] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52f78a04-a351-60a5-0ccf-81c5cd0df49b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 935.176588] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 935.176953] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 935.177408] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 935.782895] env[59379]: DEBUG nova.compute.manager [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Received event network-changed-780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 935.783104] env[59379]: DEBUG nova.compute.manager [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Refreshing instance network info cache due to event network-changed-780fb21b-08f6-490a-9550-88ae379b00bc. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 935.783317] env[59379]: DEBUG oslo_concurrency.lockutils [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] Acquiring lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 935.783451] env[59379]: DEBUG oslo_concurrency.lockutils [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] Acquired lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 935.783879] env[59379]: DEBUG nova.network.neutron [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Refreshing network info cache for port 780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 936.368937] env[59379]: DEBUG nova.network.neutron [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Updated VIF entry in instance network info cache for port 780fb21b-08f6-490a-9550-88ae379b00bc. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 936.369308] env[59379]: DEBUG nova.network.neutron [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Updating instance_info_cache with network_info: [{"id": "780fb21b-08f6-490a-9550-88ae379b00bc", "address": "fa:16:3e:f6:9a:41", "network": {"id": "f2381075-9072-4f6f-8a2f-20dae5516eae", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1918712621-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cc08c67065e0450e87f01130f1571b3f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a0734cc4-5718-45e2-9f98-0ded96880bef", "external-id": "nsx-vlan-transportzone-875", "segmentation_id": 875, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap780fb21b-08", "ovs_interfaceid": "780fb21b-08f6-490a-9550-88ae379b00bc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 936.378766] env[59379]: DEBUG oslo_concurrency.lockutils [req-18fe2fe9-0c68-4014-b9b1-cf3760e8589e req-6e1eceed-fc01-445a-9ab1-89b46454b48d service nova] Releasing lock "refresh_cache-2e622c9d-369c-4c36-a477-3237bea4cf7c" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 938.624832] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "9745bc90-6927-46a9-af48-df69046dc2a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 938.625134] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "9745bc90-6927-46a9-af48-df69046dc2a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.737867] env[59379]: WARNING oslo_vmware.rw_handles [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 947.737867] env[59379]: ERROR oslo_vmware.rw_handles [ 947.738485] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 947.739839] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 947.740183] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Copying Virtual Disk [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/ee5dc068-96af-45fb-8ced-f3a158c69906/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 947.740467] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b4f90418-15f3-4e5b-a0ce-de8d4df2c500 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.748124] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Waiting for the task: (returnval){ [ 947.748124] env[59379]: value = "task-559648" [ 947.748124] env[59379]: _type = "Task" [ 947.748124] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.756533] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Task: {'id': task-559648, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.259307] env[59379]: DEBUG oslo_vmware.exceptions [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 948.259494] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 948.260073] env[59379]: ERROR nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.260073] env[59379]: Faults: ['InvalidArgument'] [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Traceback (most recent call last): [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] yield resources [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] self.driver.spawn(context, instance, image_meta, [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] self._fetch_image_if_missing(context, vi) [ 948.260073] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] image_cache(vi, tmp_image_ds_loc) [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] vm_util.copy_virtual_disk( [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] session._wait_for_task(vmdk_copy_task) [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 948.260478] 
env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] return self.wait_for_task(task_ref) [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] return evt.wait() [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] result = hub.switch() [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 948.260478] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] return self.greenlet.switch() [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] self.f(*self.args, **self.kw) [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] raise exceptions.translate_fault(task_info.error) [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Faults: ['InvalidArgument'] [ 948.260819] env[59379]: ERROR nova.compute.manager [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] [ 948.260819] env[59379]: INFO nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Terminating instance [ 948.261896] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 948.262110] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.262339] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ebaf0a0a-62c9-4c7f-9545-0a97f75c06cc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.264417] env[59379]: DEBUG 
nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 948.264604] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.265292] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a79b2a82-27d4-4ff4-af16-8ff770ab2111 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.273042] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 948.273235] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4c0d532a-dfa9-401e-81eb-28fbbe16096d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.275290] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.275451] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 948.276341] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d94ada42-715d-479b-a50f-ccb5f92102be {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.281410] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Waiting for the task: (returnval){ [ 948.281410] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]524d1ba9-c388-ed8f-a774-052bca559f80" [ 948.281410] env[59379]: _type = "Task" [ 948.281410] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.289159] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]524d1ba9-c388-ed8f-a774-052bca559f80, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.346221] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 948.346546] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 948.346774] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Deleting the datastore file [datastore1] 03742e11-0fb2-48e2-9093-77ea7b647bf3 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 948.347042] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8f5a19c1-25f1-42dc-b590-c1ccfee9fb3c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.353639] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Waiting for the task: (returnval){ [ 948.353639] env[59379]: value = "task-559650" [ 948.353639] env[59379]: _type = "Task" [ 948.353639] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 948.361289] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Task: {'id': task-559650, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 948.791948] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 948.792308] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Creating directory with path [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 948.792436] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f37c159f-e493-4cec-af40-bd5917188684 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.803451] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Created directory with path [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 948.803914] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Fetch image to [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 948.803914] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 948.804497] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe40710-6f2c-417a-a329-3588e2ce8ba1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.810870] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8f2ddce-9a4d-41dd-9aa6-7f2a2591d0f3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.819762] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf1ae62e-37f7-41fa-8d59-295456e2585a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.850513] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-22a9bbba-f033-4ddb-8605-68704efca324 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.858605] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-20b72082-0533-455a-9867-f14962c67c1b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.862798] env[59379]: DEBUG oslo_vmware.api [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Task: {'id': task-559650, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067494} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 948.863321] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 948.863499] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 948.863687] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.863874] env[59379]: INFO nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Took 0.60 seconds to destroy the instance on the hypervisor. 
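The failed spawn above illustrates the oslo.vmware task pattern that recurs throughout this log: mutating vSphere calls such as CopyVirtualDisk_Task, UnregisterVM and DeleteDatastoreFile_Task return a Task moref immediately, and success or failure only surfaces while wait_for_task polls the TaskInfo object. That is why the InvalidArgument/fileType fault is raised from _poll_task during the copy rather than at the point of the call, while the later DeleteDatastoreFile_Task completes cleanly after 0.067s of polling. A minimal sketch of the same pattern against the public oslo.vmware API; the endpoint, credentials, datacenter reference and datastore paths below are placeholders, not values from this log:

from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vmware_exc

# Placeholder endpoint/credentials; the session logs in on construction.
session = vmware_api.VMwareAPISession(
    'vcenter.example.test', 'user', 'secret',
    api_retry_count=10,       # retries for transient session faults
    task_poll_interval=0.5)   # seconds between TaskInfo polls

dc_ref = None  # placeholder: a Datacenter moref looked up elsewhere

# The call returns a Task moref immediately; nothing has completed yet.
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task',
    session.vim.service_content.virtualDiskManager,
    sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
    sourceDatacenter=dc_ref,
    destName='[datastore1] devstack-image-cache_base/example.vmdk')

try:
    # Polls TaskInfo until it reaches success or error; an error state is
    # translated into a VimFaultException carrying the raw fault names.
    session.wait_for_task(task)
except vmware_exc.VimFaultException as e:
    print(e.fault_list, str(e))  # e.g. ['InvalidArgument']

Because the fault only appears during polling, recovery has to restart from the copy itself, which is consistent with what the second request (req-f266e82b) does above: it re-acquires the image-cache vmdk lock, prepares a fresh fetch location, and downloads the image file data again.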
[ 948.865877] env[59379]: DEBUG nova.compute.claims [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 948.866052] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.866252] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.881236] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 948.892150] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.893033] env[59379]: DEBUG nova.compute.utils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance 03742e11-0fb2-48e2-9093-77ea7b647bf3 could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 948.897320] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 948.897320] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 948.897320] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 948.897320] env[59379]: DEBUG nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 948.897320] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 948.926635] env[59379]: DEBUG oslo_vmware.rw_handles [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 948.928492] env[59379]: DEBUG nova.network.neutron [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.979544] env[59379]: INFO nova.compute.manager [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Took 0.08 seconds to deallocate network for instance. [ 948.983986] env[59379]: DEBUG oslo_vmware.rw_handles [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 948.984198] env[59379]: DEBUG oslo_vmware.rw_handles [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 949.023829] env[59379]: DEBUG oslo_concurrency.lockutils [None req-68be1b59-24b9-4b71-afba-c46bf9c469a2 tempest-ServerTagsTestJSON-2100948233 tempest-ServerTagsTestJSON-2100948233-project-member] Lock "03742e11-0fb2-48e2-9093-77ea7b647bf3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 276.814s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.031884] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 949.076737] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 949.077188] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 949.078679] env[59379]: INFO nova.compute.claims [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 949.223061] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f85aa06-0afc-4886-94a8-43f43da8c978 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.230996] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4420ffc0-9297-4a07-9cd9-2229ceb54625 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.260521] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3202e8f-bc39-4475-b017-ee0082f99df3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.267421] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b728a2e-0942-4cbb-bdcb-79507588ad5d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.280150] env[59379]: DEBUG nova.compute.provider_tree [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.288607] env[59379]: DEBUG nova.scheduler.client.report [None 
req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.303311] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 949.303764] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 949.335247] env[59379]: DEBUG nova.compute.utils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 949.336563] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 949.336730] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 949.344583] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Start building block device mappings for instance. 
{{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 949.391323] env[59379]: DEBUG nova.policy [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '578f85992eb84f0fb6aca0e5e23bdd06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ea06d39d80ab4c7db76925f3550795fa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}} [ 949.405338] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 949.426473] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-08-31T09:43:06Z,virtual_size=,visibility=), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 949.426700] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 949.426848] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 949.427032] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 949.427174] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 949.427346] env[59379]: DEBUG nova.virt.hardware 
[None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 949.427553] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 949.427704] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 949.427858] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 949.428027] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 949.428193] env[59379]: DEBUG nova.virt.hardware [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 949.428995] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a034aeb4-883c-4608-96ba-2ae91a1c7fcd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.436504] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f61f238f-7f2a-434e-8656-dade03b5b1bd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.661201] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Successfully created port: 4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 950.449451] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Successfully updated port: 4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.462819] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 
tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.462819] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquired lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.462819] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 950.510126] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 950.537927] env[59379]: DEBUG nova.compute.manager [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Received event network-vif-plugged-4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 950.538155] env[59379]: DEBUG oslo_concurrency.lockutils [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] Acquiring lock "9745bc90-6927-46a9-af48-df69046dc2a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.538409] env[59379]: DEBUG oslo_concurrency.lockutils [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] Lock "9745bc90-6927-46a9-af48-df69046dc2a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.538515] env[59379]: DEBUG oslo_concurrency.lockutils [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] Lock "9745bc90-6927-46a9-af48-df69046dc2a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.538665] env[59379]: DEBUG nova.compute.manager [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] No waiting events found dispatching network-vif-plugged-4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 950.538816] env[59379]: WARNING nova.compute.manager [req-ba7ddace-3158-4e07-b10b-dc1d078636b7 req-da21f078-bf4d-4f61-a04f-9a796971a816 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Received unexpected event network-vif-plugged-4ca1c8c8-9412-4f33-85e0-c657fee7af8c for instance with 
vm_state building and task_state spawning. [ 950.697165] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Updating instance_info_cache with network_info: [{"id": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "address": "fa:16:3e:38:bd:bb", "network": {"id": "76ec8873-e92a-4256-b9c8-0161bdc70330", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-351149930-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ea06d39d80ab4c7db76925f3550795fa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ca1c8c8-94", "ovs_interfaceid": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.707314] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Releasing lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.707587] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance network_info: |[{"id": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "address": "fa:16:3e:38:bd:bb", "network": {"id": "76ec8873-e92a-4256-b9c8-0161bdc70330", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-351149930-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ea06d39d80ab4c7db76925f3550795fa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ca1c8c8-94", "ovs_interfaceid": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 950.707996] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None 
req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:38:bd:bb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e6db039c-542c-4544-a57d-ddcc6c1e8e45', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4ca1c8c8-9412-4f33-85e0-c657fee7af8c', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 950.715440] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Creating folder: Project (ea06d39d80ab4c7db76925f3550795fa). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 950.715914] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68ee5cbf-bd6e-4465-a537-300bd7a46719 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.726230] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Created folder: Project (ea06d39d80ab4c7db76925f3550795fa) in parent group-v140509. [ 950.726401] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Creating folder: Instances. Parent ref: group-v140583. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 950.726600] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c00c9227-5b99-411f-a1df-2fde315e0fa0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.735126] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Created folder: Instances in parent group-v140583. [ 950.735334] env[59379]: DEBUG oslo.service.loopingcall [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 950.735499] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 950.735663] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c53b1aef-d38a-4926-b691-dfa182da89cc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.754065] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 950.754065] env[59379]: value = "task-559653" [ 950.754065] env[59379]: _type = "Task" [ 950.754065] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 950.761635] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559653, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.264026] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559653, 'name': CreateVM_Task, 'duration_secs': 0.292374} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 951.264199] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 951.264875] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.265117] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.265371] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 951.265606] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ea828aa-dda4-4ac4-aa42-f78b913347d9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.269872] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Waiting for the task: (returnval){ [ 951.269872] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52a1298e-e465-a11d-8708-9a347f345677" [ 951.269872] env[59379]: _type = "Task" [ 951.269872] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.277607] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52a1298e-e465-a11d-8708-9a347f345677, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.779808] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.780154] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.780383] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.565492] env[59379]: DEBUG nova.compute.manager [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Received event network-changed-4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 952.565683] env[59379]: DEBUG nova.compute.manager [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Refreshing instance network info cache due to event network-changed-4ca1c8c8-9412-4f33-85e0-c657fee7af8c. {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 952.565888] env[59379]: DEBUG oslo_concurrency.lockutils [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] Acquiring lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 952.566048] env[59379]: DEBUG oslo_concurrency.lockutils [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] Acquired lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.566237] env[59379]: DEBUG nova.network.neutron [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Refreshing network info cache for port 4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 952.895081] env[59379]: DEBUG nova.network.neutron [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Updated VIF entry in instance network info cache for port 4ca1c8c8-9412-4f33-85e0-c657fee7af8c. 
{{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 952.895434] env[59379]: DEBUG nova.network.neutron [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Updating instance_info_cache with network_info: [{"id": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "address": "fa:16:3e:38:bd:bb", "network": {"id": "76ec8873-e92a-4256-b9c8-0161bdc70330", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-351149930-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ea06d39d80ab4c7db76925f3550795fa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e6db039c-542c-4544-a57d-ddcc6c1e8e45", "external-id": "nsx-vlan-transportzone-810", "segmentation_id": 810, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4ca1c8c8-94", "ovs_interfaceid": "4ca1c8c8-9412-4f33-85e0-c657fee7af8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 952.904640] env[59379]: DEBUG oslo_concurrency.lockutils [req-da4f0c81-306d-44b8-ad1b-92f45b2220c0 req-a89f8d43-6c21-460b-949b-91bb343b7581 service nova] Releasing lock "refresh_cache-9745bc90-6927-46a9-af48-df69046dc2a2" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 959.434587] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 960.428926] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 961.433180] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 963.434261] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 963.434597] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 963.434597] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 963.452873] 
env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453096] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453169] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453281] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453403] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453523] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453640] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453755] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 963.453871] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 964.433964] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 964.434235] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 964.434386] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 964.444514] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 964.444724] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 964.444875] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 964.445069] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 964.446113] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d97649bb-156c-4b94-be04-7d18fbd21b75 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.455089] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf4c4f7-8560-467d-a22d-9951ceb9afd9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.468614] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a9e1303-fe2d-4ed6-a183-4a67d59f82c0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.474538] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb99c05d-a76b-4152-99e2-02fc52553360 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.502716] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181688MB 
free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 964.502822] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 964.502970] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 964.558742] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.558895] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559039] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559192] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559314] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 06d5ac6a-7734-46e3-80c5-d960821b7552 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559427] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559539] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2e622c9d-369c-4c36-a477-3237bea4cf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559650] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 9745bc90-6927-46a9-af48-df69046dc2a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 964.559960] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 964.560061] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 964.649163] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-103efce7-8feb-4e23-9e5c-570de83f31cc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.657761] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-196ba8b4-b850-4fa9-a6d4-aa36c5eae6e3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.686715] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d670da62-11d4-48fe-89ea-edb249fcce12 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.693755] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a034363-e24b-447f-9596-ce0767ad2de8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.706206] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 964.714707] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 964.729405] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 964.729571] env[59379]: DEBUG oslo_concurrency.lockutils [None 
req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 965.729487] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 965.729862] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 965.729862] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 976.393213] env[59379]: WARNING oslo_vmware.rw_handles [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 976.393213] env[59379]: ERROR oslo_vmware.rw_handles [ 976.396017] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 976.396017] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 976.396396] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None 
req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Copying Virtual Disk [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/7b0282d7-918c-417c-9b62-33a1f9d99823/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 976.396765] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-930fceb8-9d6b-40b9-b8c5-6e676a201e47 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.404843] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for the task: (returnval){ [ 976.404843] env[59379]: value = "task-559654" [ 976.404843] env[59379]: _type = "Task" [ 976.404843] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.413538] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Task: {'id': task-559654, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.915737] env[59379]: DEBUG oslo_vmware.exceptions [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 976.915938] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 976.916569] env[59379]: ERROR nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.916569] env[59379]: Faults: ['InvalidArgument'] [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Traceback (most recent call last): [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] yield resources [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self.driver.spawn(context, instance, image_meta, [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self._fetch_image_if_missing(context, vi) [ 976.916569] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] image_cache(vi, tmp_image_ds_loc) [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] vm_util.copy_virtual_disk( [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] session._wait_for_task(vmdk_copy_task) [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return self.wait_for_task(task_ref) [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return evt.wait() [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] result = hub.switch() [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 976.916885] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return self.greenlet.switch() [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self.f(*self.args, **self.kw) [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] raise exceptions.translate_fault(task_info.error) [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Faults: ['InvalidArgument'] [ 976.917191] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] [ 976.917191] env[59379]: INFO nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Terminating instance [ 976.918559] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 976.918640] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 976.919235] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 976.919416] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 976.919623] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dfbe1fd4-ad1a-4f53-bacf-9c3daf7bec7c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.922050] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0341ee96-d992-467f-a956-d25ac794bafa {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.928263] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 976.928447] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89288664-36b9-43fd-aa59-3e57b76b4848 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.930715] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 976.930877] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 976.931526] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e965da12-eae4-4c14-9afb-2e6bde051836 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.937268] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for the task: (returnval){ [ 976.937268] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]5296b98a-4127-b72d-d46d-6403009602ac" [ 976.937268] env[59379]: _type = "Task" [ 976.937268] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.944087] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]5296b98a-4127-b72d-d46d-6403009602ac, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.998063] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 976.998281] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 976.998454] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Deleting the datastore file [datastore2] 50ff2169-9c1f-4f7a-b365-1949dac57f86 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 976.998754] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ac235cc8-4c3c-4f77-a1be-13f1dadbda08 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.004125] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for the task: (returnval){ [ 977.004125] env[59379]: value = "task-559656" [ 977.004125] env[59379]: _type = "Task" [ 977.004125] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 977.011623] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Task: {'id': task-559656, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 977.447746] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 977.448489] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Creating directory with path [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 977.448823] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-819cce21-27c0-448b-b9e9-3aab5b31f76e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.460438] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Created directory with path [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 977.460438] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Fetch image to [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 977.460601] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 977.461311] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37b2fc3a-c3e7-443b-8af4-55bdf0eb05d4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.468218] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c13c41f9-08b4-473e-8d53-2986f9c31213 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.476888] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94931a8b-5738-474a-9e67-25299a41068e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.508854] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddb3b5d0-0f5d-4e35-94c0-bd58fc0177fa {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.516856] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-10d41ccc-22f1-4b2c-bf2e-ce6b690e1a5c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.518448] env[59379]: DEBUG oslo_vmware.api [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Task: {'id': task-559656, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075445} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 977.518673] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 977.518843] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 977.519009] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 977.519182] env[59379]: INFO nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Took 0.60 seconds to destroy the instance on the hypervisor. 
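The DeleteDatastoreFile_Task records above follow oslo.vmware's standard invoke-then-poll pattern: a *_Task method returns a task reference, and wait_for_task() polls it (the "progress is 0%" lines) until it reports success or raises a translated fault. A minimal sketch of that pattern follows; the host, credentials, and datastore path are placeholders and do not come from this log.

    # Minimal sketch of the invoke/poll pattern, assuming oslo.vmware is
    # installed; host, credentials, and the datastore path are placeholders.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.com', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # *_Task methods return a task moref; wait_for_task() polls it in a
    # loop (logged as "progress is N%" / "completed successfully") and
    # raises a translated fault if task_info.state ends up as 'error'.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] example-instance-dir',
        datacenter=None)  # a real call passes a Datacenter moref here
    session.wait_for_task(task)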
[ 977.521196] env[59379]: DEBUG nova.compute.claims [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 977.521356] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.521553] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.539740] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 977.586962] env[59379]: DEBUG oslo_vmware.rw_handles [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 977.642633] env[59379]: DEBUG oslo_vmware.rw_handles [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 977.642633] env[59379]: DEBUG oslo_vmware.rw_handles [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 977.706195] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bde5de1b-2de6-413f-8a95-6025b42cf324 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.715257] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e66ac97-82d7-4dfb-bb72-720464ad2522 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.744651] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dbd364f-eacd-45fe-b328-72a6d7f00bea {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.751500] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd03dc68-8474-4fbd-b219-31ee423c9e69 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.764179] env[59379]: DEBUG nova.compute.provider_tree [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.775181] env[59379]: DEBUG nova.scheduler.client.report [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 977.790938] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.269s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.791472] env[59379]: ERROR nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.791472] env[59379]: Faults: ['InvalidArgument'] [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Traceback (most recent call last): [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] 
self.driver.spawn(context, instance, image_meta, [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self._vmops.spawn(context, instance, image_meta, injected_files, [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self._fetch_image_if_missing(context, vi) [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] image_cache(vi, tmp_image_ds_loc) [ 977.791472] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] vm_util.copy_virtual_disk( [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] session._wait_for_task(vmdk_copy_task) [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return self.wait_for_task(task_ref) [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return evt.wait() [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] result = hub.switch() [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] return self.greenlet.switch() [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 977.791825] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] self.f(*self.args, **self.kw) [ 977.792184] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 977.792184] env[59379]: ERROR nova.compute.manager [instance: 
50ff2169-9c1f-4f7a-b365-1949dac57f86] raise exceptions.translate_fault(task_info.error) [ 977.792184] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.792184] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Faults: ['InvalidArgument'] [ 977.792184] env[59379]: ERROR nova.compute.manager [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] [ 977.792184] env[59379]: DEBUG nova.compute.utils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 977.793661] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Build of instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 was re-scheduled: A specified parameter was not correct: fileType [ 977.793661] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 977.794025] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 977.794197] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 977.794344] env[59379]: DEBUG nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 977.794499] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 978.040153] env[59379]: DEBUG nova.network.neutron [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 978.052818] env[59379]: INFO nova.compute.manager [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Took 0.26 seconds to deallocate network for instance.
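The re-schedule above is triggered by oslo.vmware's fault translation: _poll_task reads task_info.error and raises exceptions.translate_fault(...), and the earlier "Fault InvalidArgument not matched" record shows the fault name falling through the registry to the generic VimFaultException. A hedged sketch of catching it; the function, session, and task variable are hypothetical, patterned on the traceback above.

    # Sketch of handling the translated fault; `session` is a
    # VMwareAPISession and `vmdk_copy_task` a task moref, as in the
    # earlier sketch. The handler body is illustrative.
    from oslo_vmware import exceptions

    def wait_or_reschedule(session, vmdk_copy_task):
        try:
            session.wait_for_task(vmdk_copy_task)
        except exceptions.VimFaultException as e:
            # e.fault_list carries the raw vSphere fault names from
            # task_info.error, e.g. ['InvalidArgument']. Names without a
            # registered exception class are logged as "Fault ... not
            # matched" and raised as this generic VimFaultException.
            if 'InvalidArgument' in e.fault_list:
                raise  # the compute manager re-schedules the build here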
[ 978.133074] env[59379]: INFO nova.scheduler.client.report [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Deleted allocations for instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 [ 978.152162] env[59379]: DEBUG oslo_concurrency.lockutils [None req-88dcc985-5caa-4282-a716-85ce2d8552cc tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 390.730s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 978.152406] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 384.306s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 978.152580] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] During sync_power_state the instance has a pending task (spawning). Skip. [ 978.152741] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 978.152949] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 189.652s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 978.153169] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Acquiring lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 978.153360] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 978.153517] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
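The Acquiring/acquired/released DEBUG lines above all come from oslo.concurrency's lock decorator, which serializes work on a single instance UUID; that is why the terminate request waited 189.652s behind a build that held the lock for 390.730s. A minimal, runnable sketch of the pattern (the lock name is illustrative, reused from the log only for readability):

import logging
from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)  # surfaces the lockutils DEBUG lines

@lockutils.synchronized('50ff2169-9c1f-4f7a-b365-1949dac57f86')
def do_terminate_instance():
    # Concurrent callers using the same lock name block here, so a
    # terminate issued mid-build waits until the build path releases
    # the instance lock, exactly as timed in the records above.
    pass

do_terminate_instance()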
[ 978.155346] env[59379]: INFO nova.compute.manager [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Terminating instance [ 978.157105] env[59379]: DEBUG nova.compute.manager [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 978.157298] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 978.157727] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-23602a86-f6a0-4b5c-9bf6-e49774fc262c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.167101] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-095057e5-a645-4568-8854-8987d46013d5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.194801] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 50ff2169-9c1f-4f7a-b365-1949dac57f86 could not be found. [ 978.194990] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 978.195184] env[59379]: INFO nova.compute.manager [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Took 0.04 seconds to destroy the instance on the hypervisor. [ 978.195411] env[59379]: DEBUG oslo.service.loopingcall [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 978.195618] env[59379]: DEBUG nova.compute.manager [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 978.195709] env[59379]: DEBUG nova.network.neutron [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 978.218390] env[59379]: DEBUG nova.network.neutron [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 978.226466] env[59379]: INFO nova.compute.manager [-] [instance: 50ff2169-9c1f-4f7a-b365-1949dac57f86] Took 0.03 seconds to deallocate network for instance. [ 978.307577] env[59379]: DEBUG oslo_concurrency.lockutils [None req-4319ac7a-c6b0-401d-8699-fbde4ef96c01 tempest-TenantUsagesTestJSON-1443026195 tempest-TenantUsagesTestJSON-1443026195-project-member] Lock "50ff2169-9c1f-4f7a-b365-1949dac57f86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.154s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 988.674194] env[59379]: DEBUG nova.compute.manager [req-9558a568-894e-4823-ab02-0d27791252fa req-b081a03b-9db0-4400-962c-450139b94f20 service nova] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Received event network-vif-deleted-3aabbd9f-c7ef-4867-9ac8-dfea570218c7 {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 996.734124] env[59379]: WARNING oslo_vmware.rw_handles [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 996.734124] env[59379]: ERROR oslo_vmware.rw_handles [ 996.734669] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Downloaded image
file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 996.736114] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 996.736466] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Copying Virtual Disk [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/a530d4bc-7199-4896-ab23-6434b27a0c00/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 996.736754] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-909ca1b0-f451-489f-9093-8687f5caca40 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.744322] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Waiting for the task: (returnval){ [ 996.744322] env[59379]: value = "task-559657" [ 996.744322] env[59379]: _type = "Task" [ 996.744322] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 996.752477] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Task: {'id': task-559657, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.255409] env[59379]: DEBUG oslo_vmware.exceptions [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 997.255692] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 997.256243] env[59379]: ERROR nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 997.256243] env[59379]: Faults: ['InvalidArgument'] [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] yield resources [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.driver.spawn(context, instance, image_meta, [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._vmops.spawn(context, instance, image_meta, injected_files, [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._fetch_image_if_missing(context, vi) [ 997.256243] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] image_cache(vi, tmp_image_ds_loc) [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] vm_util.copy_virtual_disk( [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] session._wait_for_task(vmdk_copy_task) [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.wait_for_task(task_ref) [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return evt.wait() [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] result = hub.switch() [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 997.256590] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.greenlet.switch() [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.f(*self.args, **self.kw) [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise exceptions.translate_fault(task_info.error) [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Faults: ['InvalidArgument'] [ 997.257055] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 997.257055] env[59379]: INFO nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Terminating instance [ 997.258163] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 997.258299] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 997.258901] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 
tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 997.259098] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 997.259318] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ebed261e-46d4-4a31-9ddf-1c576f1e460b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.261685] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01c1abb7-7955-4219-a0ee-b89af3f84ec0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.269087] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 997.270318] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ff2e4710-149e-4590-84bf-8709f509acc8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.271858] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 997.272034] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 997.272693] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd2a7e46-2f8a-4a11-8d72-a7d4223d33a2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.278748] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Waiting for the task: (returnval){ [ 997.278748] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]522852e0-8b0d-732d-beae-22547ed408d3" [ 997.278748] env[59379]: _type = "Task" [ 997.278748] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 997.286552] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]522852e0-8b0d-732d-beae-22547ed408d3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.349066] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 997.349331] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 997.349584] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Deleting the datastore file [datastore1] 05010bc2-c30a-49bf-8daa-3eec6a5e9022 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 997.349762] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f582c4b2-e1ff-4b8f-81bd-c88ab4748532 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.356057] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Waiting for the task: (returnval){ [ 997.356057] env[59379]: value = "task-559659" [ 997.356057] env[59379]: _type = "Task" [ 997.356057] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 997.364213] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Task: {'id': task-559659, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 997.790615] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 997.790903] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Creating directory with path [datastore1] vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 997.791157] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ff697238-3841-4a12-86b9-d8c69225138f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.804652] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Created directory with path [datastore1] vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 997.804652] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Fetch image to [datastore1] vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 997.804822] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 997.805617] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e637863-b4b1-4832-b75b-c9a8379e415b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.812622] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c349f0c-388d-4f3b-bb1e-c14706a729c1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.822709] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85bb0571-25ed-4c13-a72b-8692ac772265 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.857306] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-817251c8-c587-4817-b33d-ec299c567e75 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.870401] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-98620a40-9345-44bd-a55b-631186c822d8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.872069] env[59379]: DEBUG oslo_vmware.api [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Task: {'id': task-559659, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075072} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 997.872322] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 997.872494] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 997.872722] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 997.872908] env[59379]: INFO nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Took 0.61 seconds to destroy the instance on the hypervisor. 
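Both task waits above (the CopyVirtualDisk_Task that is about to fail again and the DeleteDatastoreFile_Task that completes in 0.075s) follow the same polling shape: a fixed-interval loop checks task info until a terminal state, emitting the "progress is 0%" lines along the way. A self-contained sketch of that shape using oslo.service's public looping-call API (the tracebacks show oslo.vmware uses its own vendored variant); TaskPoller-style helper and the fake task states are illustrative:

from oslo_service import loopingcall

def make_poller(fetch_info):
    def poll():
        info = fetch_info()
        print("Task progress is %s" % info.get('progress', '0%'))
        if info['state'] == 'success':
            # LoopingCallDone stops the loop and hands the result to wait().
            raise loopingcall.LoopingCallDone(info)
        if info['state'] == 'error':
            # In oslo.vmware this is where the task error is translated
            # into a fault exception such as VimFaultException.
            raise RuntimeError(info.get('error', 'task failed'))
    return poll

states = iter([{'state': 'running'}, {'state': 'running'},
               {'state': 'success', 'progress': '100%'}])
timer = loopingcall.FixedIntervalLoopingCall(make_poller(lambda: next(states)))
result = timer.start(interval=0.5).wait()  # blocks until LoopingCallDone
print(result)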
[ 997.875084] env[59379]: DEBUG nova.compute.claims [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 997.875249] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 997.875447] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 997.897350] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 997.903306] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.904011] env[59379]: DEBUG nova.compute.utils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 997.911260] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance disappeared during build. 
{{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 997.911260] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 997.911260] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 997.911260] env[59379]: DEBUG nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 997.911431] env[59379]: DEBUG nova.network.neutron [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 997.970804] env[59379]: DEBUG oslo_vmware.rw_handles [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 998.034565] env[59379]: DEBUG oslo_vmware.rw_handles [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 998.034665] env[59379]: DEBUG oslo_vmware.rw_handles [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 998.319149] env[59379]: DEBUG neutronclient.v2_0.client [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59379) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 998.322765] env[59379]: ERROR nova.compute.manager [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.driver.spawn(context, instance, image_meta, [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._vmops.spawn(context, instance, image_meta, injected_files, [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._fetch_image_if_missing(context, vi) [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] image_cache(vi, tmp_image_ds_loc) [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 998.322765] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] vm_util.copy_virtual_disk( [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] session._wait_for_task(vmdk_copy_task) [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.wait_for_task(task_ref) [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 998.323137] 
env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return evt.wait() [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] result = hub.switch() [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.greenlet.switch() [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.f(*self.args, **self.kw) [ 998.323137] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise exceptions.translate_fault(task_info.error) [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Faults: ['InvalidArgument'] [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] During handling of the above exception, another exception occurred: [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._build_and_run_instance(context, instance, image, [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] with excutils.save_and_reraise_exception(): [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.force_reraise() [ 998.323458] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] 
raise self.value [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] with self.rt.instance_claim(context, instance, node, allocs, [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.abort() [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.tracker.abort_instance_claim(self.context, self.instance, [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return f(*args, **kwargs) [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._unset_instance_host_and_node(instance) [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 998.323788] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] instance.save() [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] updates, result = self.indirection_api.object_action( [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return cctxt.call(context, 'object_action', objinst=objinst, [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] result = self.transport._send( [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self._driver.send(target, ctxt, message, [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 
998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 998.324161] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise result [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] nova.exception_Remote.InstanceNotFound_Remote: Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 could not be found. [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return getattr(target, method)(*args, **kwargs) [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return fn(self, *args, **kwargs) [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] old_ref, inst_ref = db.instance_update_and_get_original( [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return f(*args, **kwargs) [ 998.324460] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] with excutils.save_and_reraise_exception() as ectxt: [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.force_reraise() [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324838] env[59379]: ERROR nova.compute.manager 
[instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise self.value [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return f(*args, **kwargs) [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return f(context, *args, **kwargs) [ 998.324838] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise exception.InstanceNotFound(instance_id=uuid) [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] nova.exception.InstanceNotFound: Instance 05010bc2-c30a-49bf-8daa-3eec6a5e9022 could not be found. 
[ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] During handling of the above exception, another exception occurred: [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.325257] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] exception_handler_v20(status_code, error_body) [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise client_exc(message=error_message, [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Neutron server returns request_ids: ['req-ce2f30f0-b560-40a3-a0b7-64d15d0d1218'] [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] During handling of the above exception, another exception occurred: [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Traceback (most recent call last): [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._deallocate_network(context, instance, requested_networks) [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 998.325640] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self.network_api.deallocate_for_instance( [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 
05010bc2-c30a-49bf-8daa-3eec6a5e9022] data = neutron.list_ports(**search_opts) [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.list('ports', self.ports_path, retrieve_all, [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] for r in self._pagination(collection, path, **params): [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] res = self.get(path, params=params) [ 998.325952] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.retry_request("GET", action, body=body, [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] return self.do_request(method, action, body=body, [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] ret = obj(*args, **kwargs) [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 
998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] self._handle_fault_response(status_code, replybody, resp) [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 998.326472] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] raise exception.Unauthorized() [ 998.326763] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] nova.exception.Unauthorized: Not authorized. [ 998.326763] env[59379]: ERROR nova.compute.manager [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] [ 998.345354] env[59379]: DEBUG oslo_concurrency.lockutils [None req-f266e82b-4618-4f1b-aca0-2ae761f4878b tempest-InstanceActionsNegativeTestJSON-1893694906 tempest-InstanceActionsNegativeTestJSON-1893694906-project-member] Lock "05010bc2-c30a-49bf-8daa-3eec6a5e9022" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 325.173s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.435786] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1022.429432] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1023.434726] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1024.434597] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1024.434796] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1024.435062] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1024.452038] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452038] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Skipping network cache update for instance because it is Building. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452038] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452038] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452038] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452448] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.452448] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1025.434382] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1025.451347] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1026.411746] env[59379]: WARNING oslo_vmware.rw_handles [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.411746] env[59379]: ERROR oslo_vmware.rw_handles [ 1026.412283] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1026.414048] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1026.414304] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Copying Virtual Disk [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/d48f35ee-e267-4595-a3f2-ba1af942c754/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1026.414572] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3caa3458-05ec-4872-bb68-3913b2730a86 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.422459] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for the task: (returnval){ [ 1026.422459] env[59379]: value = "task-559660" [ 1026.422459] env[59379]: _type = "Task" [ 1026.422459] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.430024] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Task: {'id': task-559660, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.433573] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1026.433759] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1026.433914] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1026.434076] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1026.444300] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1026.444515] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1026.444691] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.444854] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1026.445901] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b38ef1e-de86-40f1-9e19-c1de33103b4f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.453549] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0a2abb8-2333-4c1f-9cee-01f928e1f3a6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.467340] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bff81469-7475-48eb-964a-1a59c5074701 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.473462] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cff3c48f-2959-46e9-a0c3-3d71b78359cb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.502051] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181727MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1026.502192] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1026.502379] env[59379]: DEBUG oslo_concurrency.lockutils [None 
req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1026.556865] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557086] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 71554abb-780c-4681-909f-8ff93712c82e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557223] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557344] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557460] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2e622c9d-369c-4c36-a477-3237bea4cf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557572] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 9745bc90-6927-46a9-af48-df69046dc2a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1026.557755] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1026.557889] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1026.572990] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing inventories for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1026.583907] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating ProviderTree inventory for provider 693f1d2b-e627-44fb-bcd5-714cccac894b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1026.584115] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Updating inventory in ProviderTree for provider 693f1d2b-e627-44fb-bcd5-714cccac894b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1026.593427] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing aggregate associations for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b, aggregates: None {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1026.608951] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Refreshing trait associations for resource provider 693f1d2b-e627-44fb-bcd5-714cccac894b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=59379) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1026.677321] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b429a94-34c8-4ab3-8399-409843bc5242 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.684917] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-911999c0-0fb0-456c-b63f-00f0f98aa613 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.715453] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23643078-fa83-4cfa-a34d-3b3301a774ab {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.722528] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5866c5c-fb8c-4899-919e-0b44540d8bc2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.735396] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1026.743628] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1026.756728] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1026.756848] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.932542] env[59379]: DEBUG oslo_vmware.exceptions [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1026.932773] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.933356] env[59379]: ERROR nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.933356] env[59379]: Faults: ['InvalidArgument'] [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Traceback (most recent call last): [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] yield resources [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self.driver.spawn(context, instance, image_meta, [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self._fetch_image_if_missing(context, vi) [ 1026.933356] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] image_cache(vi, tmp_image_ds_loc) [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] vm_util.copy_virtual_disk( [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] session._wait_for_task(vmdk_copy_task) [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return self.wait_for_task(task_ref) [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return evt.wait() [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] result = hub.switch() [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1026.933714] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return self.greenlet.switch() [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self.f(*self.args, **self.kw) [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] raise exceptions.translate_fault(task_info.error) [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Faults: ['InvalidArgument'] [ 1026.934044] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] [ 1026.934044] env[59379]: INFO nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Terminating instance [ 1026.935131] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.935335] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1026.935926] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Start destroying the 
instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1026.936119] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1026.936330] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-af0f5bc1-48c3-4c22-a604-9cea093338e9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.938743] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32be710f-3ba4-40ca-9bb4-67ba451919fc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.945149] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1026.945339] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0ddbc2c5-8702-4103-891b-cad0af0636e8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.947376] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1026.947538] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1026.948461] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-191ba2ba-bd11-47a0-a73d-8c9dce067a2a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.953040] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for the task: (returnval){ [ 1026.953040] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]524dddd7-abd6-b1bb-722d-ca250cbda4b4" [ 1026.953040] env[59379]: _type = "Task" [ 1026.953040] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.960521] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]524dddd7-abd6-b1bb-722d-ca250cbda4b4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.018101] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1027.018101] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1027.018101] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Deleting the datastore file [datastore2] 2545ca35-7a3f-47ed-b0de-e1bb26967379 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1027.018325] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0cd5b76a-32b1-4fb3-b5ee-6ea9f129a944 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.023896] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for the task: (returnval){ [ 1027.023896] env[59379]: value = "task-559662" [ 1027.023896] env[59379]: _type = "Task" [ 1027.023896] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.031331] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Task: {'id': task-559662, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.463299] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1027.463638] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Creating directory with path [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1027.463755] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-935b47cc-8947-4b76-a1fc-4fb7d195c774 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.475289] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Created directory with path [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1027.475472] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Fetch image to [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1027.475628] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1027.476366] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05fdefaa-c424-4040-b00e-32177bd969cf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.484484] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1e1e18-68f1-468c-af7d-2946a751a8d7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.493450] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01a96fb6-b463-4be3-b0e0-ffcea9531e96 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.524198] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-780d3943-d882-475b-93ae-7bbe723313b1 {{(pid=59379) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.535125] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3931ee04-fa16-4316-bbaf-704813b71418 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.536841] env[59379]: DEBUG oslo_vmware.api [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Task: {'id': task-559662, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063729} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1027.537081] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1027.537257] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1027.537420] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1027.537592] env[59379]: INFO nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Took 0.60 seconds to destroy the instance on the hypervisor. 
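[editor's note] The records above show two vCenter tasks (CopyVirtualDisk_Task task-559660 and DeleteDatastoreFile_Task task-559662) being driven through oslo_vmware.api's wait_for_task, which polls until vCenter reports success ('duration_secs': 0.063729) or a fault. The snippet below is a minimal schematic of that poll loop, not the oslo.vmware source; fetch_task_info is a hypothetical stand-in for the PropertyCollector query that the library's _poll_task actually issues.

    # Schematic reconstruction of the polling visible above
    # ("Waiting for the task ... to complete", "progress is 0%",
    # "completed successfully"). Not the oslo.vmware implementation.
    import time

    POLL_INTERVAL = 0.5  # the real interval is a configurable looping call


    class TaskFailed(Exception):
        """Stands in for oslo_vmware.exceptions.VimFaultException."""


    def wait_for_task(task_ref, fetch_task_info):
        """Poll a vCenter task reference until it succeeds or errors."""
        while True:
            info = fetch_task_info(task_ref)  # hypothetical helper
            if info["state"] in ("queued", "running"):
                # Corresponds to the "progress is N%" DEBUG lines.
                print(f"Task {task_ref}: progress is {info.get('progress', 0)}%")
                time.sleep(POLL_INTERVAL)
            elif info["state"] == "success":
                return info
            else:
                # An "error" state is translated into a fault exception,
                # which is how the VimFaultException tracebacks below arise.
                raise TaskFailed(info.get("error", "unknown fault"))

Under this reading, the "Fault InvalidArgument not matched" DEBUG line earlier is the library failing to map the vCenter fault to a more specific exception class before raising the generic VimFaultException.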
[ 1027.539714] env[59379]: DEBUG nova.compute.claims [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1027.539867] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.540143] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.556993] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1027.605836] env[59379]: DEBUG oslo_vmware.rw_handles [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1027.661965] env[59379]: DEBUG oslo_vmware.rw_handles [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1027.662151] env[59379]: DEBUG oslo_vmware.rw_handles [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1027.701014] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b895c06f-b20e-49b8-bacb-949b18d3acd6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.708344] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b5163b3-d9d6-44ca-b806-afeb5de96efd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.739034] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02b00b7d-ae80-419d-9c53-a3d73ee4c4f4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.745801] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2aa69c-98b7-47ae-a15d-7ba99ea7e770 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.758790] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1027.759220] env[59379]: DEBUG nova.compute.provider_tree [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1027.767838] env[59379]: DEBUG nova.scheduler.client.report [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1027.780358] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.240s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.780566] env[59379]: ERROR nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1027.780566] env[59379]: Faults: ['InvalidArgument'] [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Traceback (most recent call 
last): [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self.driver.spawn(context, instance, image_meta, [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] self._fetch_image_if_missing(context, vi) [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] image_cache(vi, tmp_image_ds_loc) [ 1027.780566] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] vm_util.copy_virtual_disk( [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] session._wait_for_task(vmdk_copy_task) [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return self.wait_for_task(task_ref) [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return evt.wait() [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] result = hub.switch() [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] return self.greenlet.switch() [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1027.780885] env[59379]: ERROR nova.compute.manager [instance: 
2545ca35-7a3f-47ed-b0de-e1bb26967379] self.f(*self.args, **self.kw) [ 1027.781236] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1027.781236] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] raise exceptions.translate_fault(task_info.error) [ 1027.781236] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1027.781236] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Faults: ['InvalidArgument'] [ 1027.781236] env[59379]: ERROR nova.compute.manager [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] [ 1027.781358] env[59379]: DEBUG nova.compute.utils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1027.782650] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Build of instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 was re-scheduled: A specified parameter was not correct: fileType [ 1027.782650] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1027.783047] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1027.783223] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1027.783373] env[59379]: DEBUG nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1027.783528] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1028.025954] env[59379]: DEBUG nova.network.neutron [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.036878] env[59379]: INFO nova.compute.manager [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Took 0.25 seconds to deallocate network for instance. [ 1028.123312] env[59379]: INFO nova.scheduler.client.report [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Deleted allocations for instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 [ 1028.139301] env[59379]: DEBUG oslo_concurrency.lockutils [None req-18ce2558-2991-424f-9e61-8c4d378db819 tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 439.943s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.139524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 434.293s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.139705] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] During sync_power_state the instance has a pending task (spawning). Skip. 
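[editor's note] The lockutils records above trace one full lock lifecycle for instance 2545ca35-...: "Acquiring lock", "acquired ... waited 434.293s", '"released" ... held 439.943s'. A minimal sketch of that wait/hold bookkeeping follows, using plain threading rather than oslo_concurrency (the real mechanism is lockutils' synchronized decorator); the timing prints mirror the DEBUG lines in the log.

    # Sketch of the wait/hold accounting behind the lockutils DEBUG
    # lines above. Plain threading, for illustration only.
    import threading
    import time

    _locks: dict[str, threading.Lock] = {}  # one named lock per key


    def timed_lock(name: str):
        lock = _locks.setdefault(name, threading.Lock())

        def decorator(func):
            def wrapper(*args, **kwargs):
                print(f'Acquiring lock "{name}" by "{func.__name__}"')
                t0 = time.monotonic()
                with lock:
                    waited = time.monotonic() - t0
                    print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
                    t1 = time.monotonic()
                    try:
                        return func(*args, **kwargs)
                    finally:
                        held = time.monotonic() - t1
                        print(f'Lock "{name}" "released" :: held {held:.3f}s')
            return wrapper

        return decorator

The 439.943s hold above is the entire _locked_do_build_and_run_instance call, which is why the competing _sync_power_states and do_terminate_instance callers report multi-hundred-second waits on the same instance UUID.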
[ 1028.139854] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.140151] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 237.983s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.140360] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Acquiring lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.140552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.140705] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.142580] env[59379]: INFO nova.compute.manager [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Terminating instance [ 1028.144224] env[59379]: DEBUG nova.compute.manager [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1028.144412] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1028.144849] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-183bfccd-ad2f-43fe-a8f7-a759f14f1f62 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.154062] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d50f14d7-3053-4ca2-b256-23715e339e69 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.179550] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2545ca35-7a3f-47ed-b0de-e1bb26967379 could not be found. [ 1028.179736] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1028.179969] env[59379]: INFO nova.compute.manager [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1028.180229] env[59379]: DEBUG oslo.service.loopingcall [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1028.180421] env[59379]: DEBUG nova.compute.manager [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1028.180513] env[59379]: DEBUG nova.network.neutron [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1028.204084] env[59379]: DEBUG nova.network.neutron [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.211972] env[59379]: INFO nova.compute.manager [-] [instance: 2545ca35-7a3f-47ed-b0de-e1bb26967379] Took 0.03 seconds to deallocate network for instance. 
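
The "Waiting for function ... _deallocate_network_with_retries to return" line comes from oslo.service's loopingcall module, which nova uses to retry network deallocation with backoff. A rough sketch of that wrapping, assuming a generic retryable exception; the retry counts and the body are illustrative, not nova's actual values:

    from oslo_service import loopingcall

    # Retry the wrapped call on the listed exceptions, sleeping between
    # attempts; loopingcall logs 'Waiting for function ... to return.'
    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=10,
                                exceptions=(ConnectionError,))
    def deallocate_network_with_retries():
        pass  # nova's version calls the real network deallocation here

    deallocate_network_with_retries()
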
[ 1028.296561] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c99527b5-76ca-4497-95ce-a14f6b4175ab tempest-ServerDiagnosticsTest-1458258893 tempest-ServerDiagnosticsTest-1458258893-project-member] Lock "2545ca35-7a3f-47ed-b0de-e1bb26967379" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.156s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1033.056883] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1047.772444] env[59379]: WARNING oslo_vmware.rw_handles [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1047.772444] env[59379]: ERROR oslo_vmware.rw_handles [ 1047.773062] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1047.774722] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1047.775010] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/51c0bbe1-5d23-4341-9a45-2743a5d78008/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1047.775328] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-540e9223-4ee1-4b91-8bad-e034836eb6ea {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1047.784098] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Waiting for the task: (returnval){ [ 1047.784098] env[59379]: value = "task-559663" [ 1047.784098] env[59379]: _type = "Task" [ 1047.784098] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1047.791981] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Task: {'id': task-559663, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1048.295105] env[59379]: DEBUG oslo_vmware.exceptions [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1048.295343] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1048.295877] env[59379]: ERROR nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.295877] env[59379]: Faults: ['InvalidArgument'] [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Traceback (most recent call last): [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] yield resources [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] self.driver.spawn(context, instance, image_meta, [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] self._fetch_image_if_missing(context, vi) [ 1048.295877] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] image_cache(vi, tmp_image_ds_loc) [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] vm_util.copy_virtual_disk( [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] session._wait_for_task(vmdk_copy_task) [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] return self.wait_for_task(task_ref) [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] return evt.wait() [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] result = hub.switch() [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1048.296377] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] return self.greenlet.switch() [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] self.f(*self.args, **self.kw) [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] raise exceptions.translate_fault(task_info.error) [ 1048.296731] env[59379]: ERROR nova.compute.manager 
[instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Faults: ['InvalidArgument'] [ 1048.296731] env[59379]: ERROR nova.compute.manager [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] [ 1048.296731] env[59379]: INFO nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Terminating instance [ 1048.297677] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1048.297871] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1048.298128] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2f70a56c-5748-4311-9b88-289444475e56 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.300257] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1048.300441] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1048.301197] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1625e0-dd80-4247-9841-9c74833908cc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.308263] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1048.309196] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6b32e872-b461-4327-9a46-ffc9b110a626 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.310535] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1048.310699] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1048.311411] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-57e5f954-8f31-4b65-b828-4c0243ee3f31 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.316400] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){ [ 1048.316400] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]527550c8-a4aa-7f35-b1f6-b557cad2e219" [ 1048.316400] env[59379]: _type = "Task" [ 1048.316400] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1048.323330] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]527550c8-a4aa-7f35-b1f6-b557cad2e219, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1048.390365] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1048.390616] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1048.390755] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Deleting the datastore file [datastore1] 06d5ac6a-7734-46e3-80c5-d960821b7552 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1048.391011] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-341e24a8-b3f8-4816-9820-7f49eaf77865 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.396892] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Waiting for the task: (returnval){ [ 1048.396892] env[59379]: value = "task-559665" [ 1048.396892] env[59379]: _type = "Task" [ 1048.396892] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1048.404264] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Task: {'id': task-559665, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1048.826354] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1048.826730] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating directory with path [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1048.826772] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ac5c41f-7a63-4966-b32f-cb44b6d038e7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.838480] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Created directory with path [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1048.838657] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Fetch image to [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1048.838820] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1048.839580] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b4410d-50a2-4e11-9e48-4e03f3af3652 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.845849] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e62bbae-cee8-4d09-8a88-bedce6dd8183 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.855038] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cb6ff99-c601-4ce8-9abb-92297e06f953 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.886625] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f715940-3bb3-48e4-826d-ded4a36020a5 
{{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.892600] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a07dc7f9-7b49-4457-89ac-ee6a7c71ef95 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.904782] env[59379]: DEBUG oslo_vmware.api [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Task: {'id': task-559665, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074693} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1048.905013] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1048.905214] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1048.905383] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1048.905544] env[59379]: INFO nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Took 0.61 seconds to destroy the instance on the hypervisor. 
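
Task-559665 follows the standard oslo.vmware task pattern seen throughout this log: invoke a *_Task method through the session, then poll until vCenter reports completion (here after duration_secs: 0.074693). A sketch under stated assumptions: placeholder vCenter credentials and an illustrative datacenter moref id; nova's ds_util.file_delete() passes the real reference:

    from oslo_vmware import api, vim_util

    # Placeholder endpoint and credentials, for illustration only.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)
    vim = session.vim
    dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')  # illustrative id
    # Issue the delete as a vCenter task, then block while the poller
    # logs progress and finally 'completed successfully'.
    task = session.invoke_api(vim, 'DeleteDatastoreFile_Task',
                              vim.service_content.fileManager,
                              name='[datastore1] 06d5ac6a-7734-46e3-80c5-d960821b7552',
                              datacenter=dc_ref)
    session.wait_for_task(task)
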
[ 1048.907634] env[59379]: DEBUG nova.compute.claims [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1048.907883] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.908200] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.915201] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1048.936398] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.936991] env[59379]: DEBUG nova.compute.utils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance 06d5ac6a-7734-46e3-80c5-d960821b7552 could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1048.938472] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1048.938631] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1048.938783] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1048.938942] env[59379]: DEBUG nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1048.939112] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1048.958314] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1049.008239] env[59379]: DEBUG nova.network.neutron [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1049.012488] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1049.012652] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1049.017210] env[59379]: INFO nova.compute.manager [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Took 0.08 seconds to deallocate network for instance. 
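
The write handle created above is, at bottom, an HTTP PUT with a fixed Content-Length against the datastore's /folder endpoint, authorized by the session cookie or the generic service ticket acquired just beforehand. A stripped-down sketch using the host, size, and path from the log (auth value elided); it also shows where the RemoteDisconnected tracebacks in this log originate, since closing the handle reads the response, and a peer that drops the connection raises exactly there:

    import http.client

    conn = http.client.HTTPSConnection('esx7c1n3.openstack.eu-de-1.cloud.sap', 443)
    path = ('/folder/vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/'
            'a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk'
            '?dcPath=ha-datacenter&dsName=datastore1')
    conn.putrequest('PUT', path)
    conn.putheader('Content-Length', '21318656')
    conn.putheader('Cookie', '...')  # session cookie / service ticket, elided
    conn.endheaders()
    # for chunk in image_iterator: conn.send(chunk)   # stream the image data
    resp = conn.getresponse()  # http.client.RemoteDisconnected is raised here
                               # if the server closes without responding
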
[ 1049.059939] env[59379]: DEBUG oslo_concurrency.lockutils [None req-5f170e36-7309-4b78-83bf-80c26bf2baf2 tempest-ServerActionsTestOtherB-1224251085 tempest-ServerActionsTestOtherB-1224251085-project-member] Lock "06d5ac6a-7734-46e3-80c5-d960821b7552" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 256.523s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1076.763151] env[59379]: WARNING oslo_vmware.rw_handles [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1076.763151] env[59379]: ERROR oslo_vmware.rw_handles [ 1076.763845] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1076.765526] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1076.765781] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Copying Virtual Disk [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/34ecb659-2763-4806-a7b3-ab13dc098d31/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1076.766118] env[59379]: DEBUG oslo_vmware.service [-] Invoking 
VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-923ce3ac-4cd4-4c1a-9ffa-e350de9bd8c9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.773920] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for the task: (returnval){ [ 1076.773920] env[59379]: value = "task-559666" [ 1076.773920] env[59379]: _type = "Task" [ 1076.773920] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1076.781569] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Task: {'id': task-559666, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.283959] env[59379]: DEBUG oslo_vmware.exceptions [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1077.284249] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1077.284779] env[59379]: ERROR nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1077.284779] env[59379]: Faults: ['InvalidArgument'] [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] Traceback (most recent call last): [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] yield resources [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self.driver.spawn(context, instance, image_meta, [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1077.284779] 
env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self._fetch_image_if_missing(context, vi) [ 1077.284779] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] image_cache(vi, tmp_image_ds_loc) [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] vm_util.copy_virtual_disk( [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] session._wait_for_task(vmdk_copy_task) [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return self.wait_for_task(task_ref) [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return evt.wait() [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] result = hub.switch() [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1077.285177] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return self.greenlet.switch() [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self.f(*self.args, **self.kw) [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] raise exceptions.translate_fault(task_info.error) [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] Faults: ['InvalidArgument'] [ 1077.285581] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] [ 1077.285581] env[59379]: INFO nova.compute.manager 
[None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Terminating instance [ 1077.286584] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1077.286780] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1077.286997] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c12929c-47dd-46c5-aada-4f7e17cc8a0c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.290398] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1077.290578] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1077.291339] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9991d8bd-81b6-4dce-8eb0-bfcad11f5182 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.299428] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1077.299623] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3ba01a42-8aba-416e-8a9d-dddcf02ffd78 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.301672] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1077.301830] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1077.302717] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e486d40-dc87-4d29-98b8-f9e8ad9672da {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.307062] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for the task: (returnval){ [ 1077.307062] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]520b6c81-6c04-5353-c1fd-d7758bd621c2" [ 1077.307062] env[59379]: _type = "Task" [ 1077.307062] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1077.320089] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]520b6c81-6c04-5353-c1fd-d7758bd621c2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.375677] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1077.375835] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1077.376019] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Deleting the datastore file [datastore2] 71554abb-780c-4681-909f-8ff93712c82e {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1077.376264] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5b9d0ff5-8f0e-4a11-a04b-b13d9cfb5e32 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.382095] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for the task: (returnval){ [ 1077.382095] env[59379]: value = "task-559668" [ 1077.382095] env[59379]: _type = "Task" [ 1077.382095] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1077.389628] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Task: {'id': task-559668, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.817132] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1077.817506] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Creating directory with path [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1077.817686] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd2ab67c-4a17-4376-a709-9a6834a52844 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.830118] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Created directory with path [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1077.830349] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Fetch image to [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1077.830553] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1077.831419] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f70c7e-fb95-4758-9055-c41bfc6e9b35 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.839884] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e87410d3-4da5-4e3a-a6a0-68c02bf480bb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.849896] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-871340ce-b4cb-4913-babe-0e9ec183ab94 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.881505] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c264b6d8-8f7e-4f5f-a23f-fb136028ffe6 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.893726] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-202f0a3d-09b2-42cf-a41f-cc61e81784ca {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.895705] env[59379]: DEBUG oslo_vmware.api [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Task: {'id': task-559668, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068751} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1077.895975] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1077.896195] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1077.896370] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1077.896538] env[59379]: INFO nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Took 0.61 seconds to destroy the instance on the hypervisor. 
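The SearchDatastore_Task and DeleteDatastoreFile_Task entries above show oslo.vmware's task-polling pattern: wait_for_task blocks the caller while _poll_task (oslo_vmware/api.py:397/434 in the entries) re-reads the vCenter task state on a fixed interval, logging "progress is N%" until the task either completes ("completed successfully", with duration_secs) or fails, at which point the fault is translated and raised. A minimal sketch of that pattern follows; get_task_info, the state strings, and TaskFailed are stand-ins for the real vSphere TaskInfo plumbing, not oslo.vmware's actual code:

    import time

    class TaskFailed(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException (assumption)."""

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info() is assumed to return an object shaped like a
        # vSphere TaskInfo: .state in {'queued', 'running', 'success',
        # 'error'}, plus .progress, .result and .error.
        start = time.monotonic()
        while True:
            info = get_task_info()
            if info.state in ('queued', 'running'):
                # corresponds to the "progress is 0%" debug lines above
                print(f"progress is {info.progress or 0}%")
            elif info.state == 'success':
                # corresponds to "completed successfully" with duration_secs
                print(f"completed successfully in {time.monotonic() - start:.6f}s")
                return info.result
            else:
                # 'error': translate the fault and raise, as _poll_task does
                raise TaskFailed(info.error)
            time.sleep(interval)

In the real driver the loop runs on an eventlet loopingcall rather than time.sleep, which is why the tracebacks below pass through eventlet's hub and loopingcall frames before reaching _poll_task.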
[ 1077.899215] env[59379]: DEBUG nova.compute.claims [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1077.899465] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1077.899684] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.922027] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1077.976069] env[59379]: DEBUG oslo_vmware.rw_handles [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1078.044141] env[59379]: DEBUG oslo_vmware.rw_handles [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1078.044331] env[59379]: DEBUG oslo_vmware.rw_handles [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1078.077844] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5085230a-4ac9-4d4c-8050-3dbedafd36e6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.085806] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ff584e-946a-46d7-8087-b55199fc73b6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.117764] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2e06957-53c7-4a35-9b85-62aceef7f4f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.125726] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7749203-293f-4a63-8d5f-c2ffd03bf624 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.138833] env[59379]: DEBUG nova.compute.provider_tree [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1078.147496] env[59379]: DEBUG nova.scheduler.client.report [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1078.161498] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.262s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.162032] env[59379]: ERROR nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1078.162032] env[59379]: Faults: ['InvalidArgument'] [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] Traceback (most recent call last): [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 
71554abb-780c-4681-909f-8ff93712c82e] self.driver.spawn(context, instance, image_meta, [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self._fetch_image_if_missing(context, vi) [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] image_cache(vi, tmp_image_ds_loc) [ 1078.162032] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] vm_util.copy_virtual_disk( [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] session._wait_for_task(vmdk_copy_task) [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return self.wait_for_task(task_ref) [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return evt.wait() [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] result = hub.switch() [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] return self.greenlet.switch() [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1078.162685] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] self.f(*self.args, **self.kw) [ 1078.163384] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1078.163384] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] raise exceptions.translate_fault(task_info.error) [ 1078.163384] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1078.163384] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] Faults: ['InvalidArgument'] [ 1078.163384] env[59379]: ERROR nova.compute.manager [instance: 71554abb-780c-4681-909f-8ff93712c82e] [ 1078.163384] env[59379]: DEBUG nova.compute.utils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1078.164171] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Build of instance 71554abb-780c-4681-909f-8ff93712c82e was re-scheduled: A specified parameter was not correct: fileType [ 1078.164171] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1078.164525] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1078.164686] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1078.164843] env[59379]: DEBUG nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1078.164996] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1078.457784] env[59379]: DEBUG nova.network.neutron [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.469961] env[59379]: INFO nova.compute.manager [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Took 0.30 seconds to deallocate network for instance. [ 1078.560853] env[59379]: INFO nova.scheduler.client.report [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Deleted allocations for instance 71554abb-780c-4681-909f-8ff93712c82e [ 1078.576812] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c52b5a55-3c9f-48b8-a97f-1445f3bd7286 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 476.294s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.576951] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 277.643s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.577188] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Acquiring lock "71554abb-780c-4681-909f-8ff93712c82e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.577399] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" 
:: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.577557] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.581780] env[59379]: INFO nova.compute.manager [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Terminating instance [ 1078.583619] env[59379]: DEBUG nova.compute.manager [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1078.583802] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1078.584064] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a196a740-cefe-4ffc-b090-a42fb0a9f0ba {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.593469] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26f2c2e6-39fe-47e8-9b22-d2ec49036b28 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.619291] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 71554abb-780c-4681-909f-8ff93712c82e could not be found. [ 1078.619501] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1078.619674] env[59379]: INFO nova.compute.manager [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1078.619907] env[59379]: DEBUG oslo.service.loopingcall [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1078.620160] env[59379]: DEBUG nova.compute.manager [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1078.620256] env[59379]: DEBUG nova.network.neutron [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1078.644708] env[59379]: DEBUG nova.network.neutron [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.652855] env[59379]: INFO nova.compute.manager [-] [instance: 71554abb-780c-4681-909f-8ff93712c82e] Took 0.03 seconds to deallocate network for instance. [ 1078.741537] env[59379]: DEBUG oslo_concurrency.lockutils [None req-330b38d0-0cec-49e8-956e-6ca1920a6d88 tempest-VolumesAdminNegativeTest-2131673682 tempest-VolumesAdminNegativeTest-2131673682-project-member] Lock "71554abb-780c-4681-909f-8ff93712c82e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.165s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1082.429855] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1083.434164] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1084.433638] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1086.434307] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1086.434646] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1086.434646] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1086.447957] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network cache update for instance because it is Building. 
{{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.448144] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.448277] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.448409] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.448530] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1086.448943] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1087.433812] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1087.433812] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1088.433725] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1088.434049] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1088.443920] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1088.444152] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1088.444311] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1088.444464] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1088.445572] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3488e8cb-eaa9-483e-935f-8148108b3cb8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.454511] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-285dac1c-82f0-49bd-84e8-47d7573ea632 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.468366] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd238103-2630-471c-94ba-d5695a937d6e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.474282] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8673c38f-33b1-4b3b-881d-f4f5d2464ae9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.503783] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181778MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1088.503923] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1088.504091] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1088.550036] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1088.550036] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1088.550036] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 2e622c9d-369c-4c36-a477-3237bea4cf7c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1088.550036] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 9745bc90-6927-46a9-af48-df69046dc2a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1088.550228] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1088.550228] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1088.605246] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f04d9f26-d6b8-48f4-90e9-bfb3050057f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.612552] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6885d8fe-cbb3-43a1-8d2b-cc5edb8ca540 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.641731] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a215d524-bb53-4394-89de-1cb8f59b2c97 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.648325] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b8b025-ae4d-47fc-9b4b-a617a6ff5b81 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1088.661033] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1088.668957] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1088.683637] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1088.683813] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1089.684341] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] 
Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1096.434490] env[59379]: WARNING oslo_vmware.rw_handles [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1096.434490] env[59379]: ERROR oslo_vmware.rw_handles [ 1096.435058] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1096.436914] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1096.437215] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Copying Virtual Disk [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/9acf6e7e-578a-494e-ad63-d2ece00a5c5c/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1096.437541] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-223bc1ef-7f17-4f4b-9fa7-0430a5ec2d03 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.451015] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 
tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){ [ 1096.451015] env[59379]: value = "task-559669" [ 1096.451015] env[59379]: _type = "Task" [ 1096.451015] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1096.459149] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559669, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1096.961521] env[59379]: DEBUG oslo_vmware.exceptions [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1096.961784] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1096.962305] env[59379]: ERROR nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1096.962305] env[59379]: Faults: ['InvalidArgument'] [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Traceback (most recent call last): [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] yield resources [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self.driver.spawn(context, instance, image_meta, [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self._fetch_image_if_missing(context, vi) [ 1096.962305] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] image_cache(vi, tmp_image_ds_loc) [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] vm_util.copy_virtual_disk( [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] session._wait_for_task(vmdk_copy_task) [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return self.wait_for_task(task_ref) [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return evt.wait() [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] result = hub.switch() [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1096.962759] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return self.greenlet.switch() [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self.f(*self.args, **self.kw) [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] raise exceptions.translate_fault(task_info.error) [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Faults: ['InvalidArgument'] [ 1096.963128] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] [ 1096.963128] env[59379]: INFO nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Terminating instance [ 
1096.964540] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1096.964756] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1096.965380] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1096.965559] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1096.965768] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8d32ae73-25a1-4563-a805-18855062619e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.969333] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2499a56-df27-47ce-af18-bfeacdcd02d4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.975848] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1096.976052] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-05967ce1-f286-4a85-8a94-e2d01b407292 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.978148] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1096.978314] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1096.979206] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ed8e253-7e27-44ba-8ff8-4c80bcc7d085 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.983896] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Waiting for the task: (returnval){ [ 1096.983896] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]521c332d-8990-89bc-bc5b-6827f0d6f3b9" [ 1096.983896] env[59379]: _type = "Task" [ 1096.983896] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1096.990543] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]521c332d-8990-89bc-bc5b-6827f0d6f3b9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.046685] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1097.046923] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1097.047067] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleting the datastore file [datastore1] 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1097.047318] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-38b269f4-6f22-4b49-b3df-70467d2d878b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.053990] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for the task: (returnval){ [ 1097.053990] env[59379]: value = "task-559671" [ 1097.053990] env[59379]: _type = "Task" [ 1097.053990] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1097.061532] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559671, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.493998] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1097.494291] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Creating directory with path [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1097.494492] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-720e7709-ca5b-4cff-8619-97fb8d7b9a5a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.505996] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Created directory with path [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1097.506222] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Fetch image to [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1097.506361] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1097.507100] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1233ddd3-f157-49a3-ae25-bf8f41583fd6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.513658] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-207c4761-bdc4-4303-9b29-cebe61ad816d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.522442] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48b2d73c-37f1-42ac-b7ef-246407cb731b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.552486] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc2009b-ca69-4bf8-85bb-c1a2fcad85f5 {{(pid=59379) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.562904] env[59379]: DEBUG oslo_vmware.api [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Task: {'id': task-559671, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076158} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1097.563092] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1564a171-be33-437a-9814-133a94c69a50 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.564673] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1097.564906] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1097.565101] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1097.565278] env[59379]: INFO nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Took 0.60 seconds to destroy the instance on the hypervisor. 
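As with the earlier failure at 1077.899, the claim abort that follows is serialized through the "compute_resources" lock: oslo.concurrency's synchronized wrapper (the inner frames at lockutils.py:404/409/423 in the entries) logs when each caller starts acquiring the named lock, how long it waited, and how long it held it. A minimal sketch of that locking pattern, using only the public lockutils.synchronized decorator; the function body is a hypothetical placeholder, not the actual ResourceTracker code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def abort_instance_claim(instance_uuid):
        # The wrapper added by @synchronized emits the DEBUG lines seen in
        # this log: 'Acquiring lock "compute_resources" by "..."', then
        # 'acquired ... waited N.NNNs', and on exit 'released ... held N.NNNs'.
        # Hypothetical body; the real method returns the claimed resources
        # to the tracker and removes the placement allocations.
        print(f"freeing resources claimed by instance {instance_uuid}")

    abort_instance_claim("64bc3ac9-57b4-4f50-97fa-ba684c1595b4")

The waited/held timings in the surrounding entries (e.g. waited 277.643s for the instance lock at 1078.576951 above) come from this same wrapper, which is what makes lock contention directly visible in the log.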
[ 1097.567291] env[59379]: DEBUG nova.compute.claims [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1097.567451] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1097.567658] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1097.585288] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1097.631338] env[59379]: DEBUG oslo_vmware.rw_handles [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1097.685690] env[59379]: DEBUG oslo_vmware.rw_handles [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1097.685855] env[59379]: DEBUG oslo_vmware.rw_handles [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1097.707486] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bfaa393-af64-40d8-a03f-32b65fe87ecc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.716417] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0d8678f-4a18-4ead-8a8f-d01272340d7b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.744978] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7baf663c-8c2e-4387-ab6c-abb58a0ba881 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.751309] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-238d36b3-fe09-40b4-8c55-b67630d39131 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.763626] env[59379]: DEBUG nova.compute.provider_tree [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1097.771515] env[59379]: DEBUG nova.scheduler.client.report [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1097.784098] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.216s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1097.784584] env[59379]: ERROR nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1097.784584] env[59379]: Faults: ['InvalidArgument'] [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Traceback (most recent call last): [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1097.784584] env[59379]: ERROR nova.compute.manager 
[instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self.driver.spawn(context, instance, image_meta, [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self._fetch_image_if_missing(context, vi) [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] image_cache(vi, tmp_image_ds_loc) [ 1097.784584] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] vm_util.copy_virtual_disk( [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] session._wait_for_task(vmdk_copy_task) [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return self.wait_for_task(task_ref) [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return evt.wait() [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] result = hub.switch() [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] return self.greenlet.switch() [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1097.784910] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] self.f(*self.args, **self.kw) [ 1097.785566] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 1097.785566] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] raise exceptions.translate_fault(task_info.error) [ 1097.785566] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1097.785566] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Faults: ['InvalidArgument'] [ 1097.785566] env[59379]: ERROR nova.compute.manager [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] [ 1097.785566] env[59379]: DEBUG nova.compute.utils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1097.786531] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Build of instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 was re-scheduled: A specified parameter was not correct: fileType [ 1097.786531] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1097.786894] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1097.787077] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1097.787225] env[59379]: DEBUG nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1097.787378] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1097.997120] env[59379]: DEBUG nova.network.neutron [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1098.011846] env[59379]: INFO nova.compute.manager [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Took 0.22 seconds to deallocate network for instance. [ 1098.097202] env[59379]: INFO nova.scheduler.client.report [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Deleted allocations for instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 [ 1098.116030] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9b15c83c-0519-41c7-9ec6-84bb45a65a53 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 260.789s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.116030] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 65.059s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.116552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Acquiring lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1098.116552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1098.116552] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1098.118343] env[59379]: INFO nova.compute.manager [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Terminating instance [ 1098.119951] env[59379]: DEBUG nova.compute.manager [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1098.120177] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1098.120601] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8ff06223-4d28-4aaf-9e08-f4a2fb37ba4f {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.129311] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de2e68a5-d787-42e4-b491-29f6895d84a6 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.153915] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 64bc3ac9-57b4-4f50-97fa-ba684c1595b4 could not be found. [ 1098.154159] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1098.154299] env[59379]: INFO nova.compute.manager [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Took 0.03 seconds to destroy the instance on the hypervisor. 
[ 1098.154527] env[59379]: DEBUG oslo.service.loopingcall [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1098.154713] env[59379]: DEBUG nova.compute.manager [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1098.154800] env[59379]: DEBUG nova.network.neutron [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1098.176874] env[59379]: DEBUG nova.network.neutron [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1098.184287] env[59379]: INFO nova.compute.manager [-] [instance: 64bc3ac9-57b4-4f50-97fa-ba684c1595b4] Took 0.03 seconds to deallocate network for instance. [ 1098.263310] env[59379]: DEBUG oslo_concurrency.lockutils [None req-c238f67f-513c-4473-b08c-c33fc5076680 tempest-DeleteServersAdminTestJSON-1182615883 tempest-DeleteServersAdminTestJSON-1182615883-project-member] Lock "64bc3ac9-57b4-4f50-97fa-ba684c1595b4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.147s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1104.784733] env[59379]: DEBUG nova.compute.manager [req-9513d69e-0f5c-4592-8402-b6307e28f3d0 req-45585c46-ef0e-4f71-ac38-26156f0e64c9 service nova] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Received event network-vif-deleted-780fb21b-08f6-490a-9550-88ae379b00bc {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1126.444053] env[59379]: WARNING oslo_vmware.rw_handles [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1126.444053] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 
1126.444053] env[59379]: ERROR oslo_vmware.rw_handles [ 1126.444053] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1126.445383] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1126.445622] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Copying Virtual Disk [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore2] vmware_temp/440adb29-928b-4249-8f24-08840d96f57a/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1126.445899] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1acfc551-7a63-4593-aa47-ff352bb41f3a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1126.454239] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for the task: (returnval){ [ 1126.454239] env[59379]: value = "task-559672" [ 1126.454239] env[59379]: _type = "Task" [ 1126.454239] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1126.462319] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Task: {'id': task-559672, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1126.964921] env[59379]: DEBUG oslo_vmware.exceptions [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1126.965159] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1126.965700] env[59379]: ERROR nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1126.965700] env[59379]: Faults: ['InvalidArgument'] [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Traceback (most recent call last): [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] yield resources [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self.driver.spawn(context, instance, image_meta, [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self._fetch_image_if_missing(context, vi) [ 1126.965700] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] image_cache(vi, tmp_image_ds_loc) [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] vm_util.copy_virtual_disk( [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] session._wait_for_task(vmdk_copy_task) [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return self.wait_for_task(task_ref) [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return evt.wait() [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] result = hub.switch() [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1126.966072] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return self.greenlet.switch() [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self.f(*self.args, **self.kw) [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] raise exceptions.translate_fault(task_info.error) [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Faults: ['InvalidArgument'] [ 1126.966404] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] [ 1126.966404] env[59379]: INFO nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Terminating instance [ 1126.967513] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquired lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1126.967711] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1126.967940] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-900a6603-fd23-401f-9110-f90145b222ea {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1126.969916] env[59379]: DEBUG 
oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1126.970109] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1126.970275] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1126.977165] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1126.977330] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1126.978455] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-04411ec3-3bf2-4ec9-b45e-d8243a40767e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1126.985477] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Waiting for the task: (returnval){ [ 1126.985477] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52d4e264-ae07-c64e-c0a5-e0e280fae744" [ 1126.985477] env[59379]: _type = "Task" [ 1126.985477] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1126.992719] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52d4e264-ae07-c64e-c0a5-e0e280fae744, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1126.997442] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1127.081278] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1127.089736] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Releasing lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1127.090177] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1127.090370] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1127.091412] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29029791-1289-4871-9429-a57183c879ef {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.099208] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1127.099429] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-10114a10-3ac1-41f5-9c7b-4485b492eee0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.135961] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1127.136221] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1127.136347] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Deleting the datastore file [datastore2] 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1127.136579] env[59379]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-28f48573-498a-4fd8-8c6b-ef92f72baef2 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.142235] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for the task: (returnval){ [ 1127.142235] env[59379]: value = "task-559674" [ 1127.142235] env[59379]: _type = "Task" [ 1127.142235] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1127.150870] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Task: {'id': task-559674, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1127.496319] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1127.496666] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Creating directory with path [datastore2] vmware_temp/c3676409-a1bb-402c-81fc-f65a7ed2511e/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1127.496821] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42d9a7da-232d-472c-98d6-bdc577592c2d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.507836] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Created directory with path [datastore2] vmware_temp/c3676409-a1bb-402c-81fc-f65a7ed2511e/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1127.508052] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Fetch image to [datastore2] vmware_temp/c3676409-a1bb-402c-81fc-f65a7ed2511e/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1127.508197] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore2] vmware_temp/c3676409-a1bb-402c-81fc-f65a7ed2511e/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore2 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1127.508911] env[59379]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b098e32-68ff-410b-9cd6-cfe0010807c8 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.515625] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7db200ec-c2c9-427a-b874-aab57dfb2e90 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.525277] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3dcb211-2da7-40e8-8c7a-828206a401bb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.556722] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28a1575a-a613-4101-bb92-f41ba6de857e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.562532] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-790c50a6-5a0a-403a-ac4d-650c6f5305fe {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.585914] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore2 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1127.651259] env[59379]: DEBUG oslo_vmware.api [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Task: {'id': task-559674, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.032469} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1127.651514] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1127.651691] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1127.651855] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1127.652030] env[59379]: INFO nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Took 0.56 seconds to destroy the instance on the hypervisor. 
[ 1127.652268] env[59379]: DEBUG oslo.service.loopingcall [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1127.652479] env[59379]: DEBUG nova.compute.manager [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network deallocation for instance since networking was not requested. {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1127.654596] env[59379]: DEBUG nova.compute.claims [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1127.654754] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1127.654952] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1127.725962] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-299d9720-e7e3-4b83-a183-6e3652dd33ba {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.734823] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5acda9b-e886-4a8c-bb90-69f1e0fe0124 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.765481] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c130fae-45c5-43d5-a027-d01a829b0f8e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.772660] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-489119d4-22fd-4991-8997-18b2fd1d50d7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.778013] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Releasing lock "[datastore2] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1127.779541] env[59379]: ERROR nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 
238825ed-3715-444c-be7c-f42f3884df7c] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image a816e082-61f0-4ffa-a214-1bf6bd197f53. [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] result = getattr(controller, method)(*args, **kwargs) [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._get(image_id) [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1127.779541] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] resp, body = self.http_client.get(url, headers=header) [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.request(url, 'GET', **kwargs) [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._handle_response(resp) [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exc.from_response(resp, resp.content) [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] During handling of the above exception, another exception occurred: [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1127.779852] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] yield resources [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.driver.spawn(context, instance, image_meta, [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._fetch_image_if_missing(context, vi) [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image_fetch(context, vi, tmp_image_ds_loc) [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] images.fetch_image( [ 1127.780248] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] metadata = IMAGE_API.get(context, image_ref) [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return session.show(context, image_id, [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] _reraise_translated_image_exception(image_id) [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise new_exc.with_traceback(exc_trace) [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] result = getattr(controller, method)(*args, **kwargs) [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1127.780646] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._get(image_id) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] resp, body = self.http_client.get(url, headers=header) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.request(url, 'GET', **kwargs) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._handle_response(resp) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exc.from_response(resp, resp.content) [ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] nova.exception.ImageNotAuthorized: Not authorized for image a816e082-61f0-4ffa-a214-1bf6bd197f53. 
[ 1127.781030] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1127.781336] env[59379]: INFO nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Terminating instance [ 1127.788954] env[59379]: DEBUG nova.compute.provider_tree [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1127.790902] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1127.791136] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1127.791835] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fe3ccd2-79cd-47a0-b40d-a4ca41482bbb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.796746] env[59379]: DEBUG nova.scheduler.client.report [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1127.801541] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1127.801899] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c542f52b-abef-4e4e-b265-470a07712926 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.809488] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.154s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1127.809969] env[59379]: ERROR 
nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1127.809969] env[59379]: Faults: ['InvalidArgument'] [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Traceback (most recent call last): [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self.driver.spawn(context, instance, image_meta, [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self._fetch_image_if_missing(context, vi) [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] image_cache(vi, tmp_image_ds_loc) [ 1127.809969] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] vm_util.copy_virtual_disk( [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] session._wait_for_task(vmdk_copy_task) [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return self.wait_for_task(task_ref) [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return evt.wait() [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] result = hub.switch() [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 
789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] return self.greenlet.switch() [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1127.810311] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] self.f(*self.args, **self.kw) [ 1127.810616] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1127.810616] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] raise exceptions.translate_fault(task_info.error) [ 1127.810616] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1127.810616] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Faults: ['InvalidArgument'] [ 1127.810616] env[59379]: ERROR nova.compute.manager [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] [ 1127.810737] env[59379]: DEBUG nova.compute.utils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1127.812032] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Build of instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 was re-scheduled: A specified parameter was not correct: fileType [ 1127.812032] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1127.812401] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1127.812610] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1127.812746] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1127.812895] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Building 
network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1127.835248] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1127.876042] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1127.876214] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Deleting contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1127.876256] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Deleting the datastore file [datastore2] 238825ed-3715-444c-be7c-f42f3884df7c {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1127.876490] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-008a47cb-9294-4167-84c1-374760a60382 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1127.883140] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Waiting for the task: (returnval){ [ 1127.883140] env[59379]: value = "task-559676" [ 1127.883140] env[59379]: _type = "Task" [ 1127.883140] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1127.892157] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Task: {'id': task-559676, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1127.894171] env[59379]: DEBUG nova.network.neutron [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1127.906426] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Releasing lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1127.906640] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1127.906829] env[59379]: DEBUG nova.compute.manager [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Skipping network deallocation for instance since networking was not requested. {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1127.988656] env[59379]: INFO nova.scheduler.client.report [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Deleted allocations for instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 [ 1128.006018] env[59379]: DEBUG oslo_concurrency.lockutils [None req-11c043ea-2ab6-4554-94b6-49c617be9c12 tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 525.073s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1128.006260] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 328.749s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1128.006481] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1128.006680] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1128.006838] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1128.008556] env[59379]: INFO nova.compute.manager [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Terminating instance [ 1128.010341] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquiring lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1128.010488] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Acquired lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1128.010648] env[59379]: DEBUG nova.network.neutron [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1128.034503] env[59379]: DEBUG nova.network.neutron [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1128.089279] env[59379]: DEBUG nova.network.neutron [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1128.098200] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Releasing lock "refresh_cache-789a3358-bc70-44a5-bb2f-4fc2f1ff9116" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1128.098537] env[59379]: DEBUG nova.compute.manager [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Start destroying the instance on the hypervisor. 
{{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1128.098717] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1128.099185] env[59379]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-430ff3ac-919f-43e3-b71b-bbddf3cb352b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1128.107760] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f60d34a5-f5cf-4b07-9ab7-9cdeb107d5c1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1128.133646] env[59379]: WARNING nova.virt.vmwareapi.vmops [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 789a3358-bc70-44a5-bb2f-4fc2f1ff9116 could not be found. [ 1128.133825] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1128.133990] env[59379]: INFO nova.compute.manager [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1128.134226] env[59379]: DEBUG oslo.service.loopingcall [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1128.134406] env[59379]: DEBUG nova.compute.manager [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1128.134496] env[59379]: DEBUG nova.network.neutron [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1128.149756] env[59379]: DEBUG nova.network.neutron [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Instance cache missing network info. {{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1128.156864] env[59379]: DEBUG nova.network.neutron [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1128.164518] env[59379]: INFO nova.compute.manager [-] [instance: 789a3358-bc70-44a5-bb2f-4fc2f1ff9116] Took 0.03 seconds to deallocate network for instance. 
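Note how this second terminate of 789a3358 succeeds even though the VM is already gone: vmops catches InstanceNotFound from the backend lookup ("Instance does not exist on backend"), logs a warning, and reports the instance destroyed anyway, so network deallocation can proceed. A rough sketch of that idempotent-destroy behaviour; FakeBackend and its methods are invented for illustration and are not nova's driver interface.

    class InstanceNotFound(Exception):
        pass

    class FakeBackend:
        """Toy stand-in for the hypervisor API."""

        def __init__(self):
            self.vms = {}  # uuid -> opaque VM record

        def find_by_uuid(self, uuid):
            if uuid not in self.vms:
                raise InstanceNotFound(uuid)
            return self.vms[uuid]

        def unregister(self, uuid):
            self.vms.pop(uuid, None)

    def destroy(backend, uuid):
        try:
            backend.find_by_uuid(uuid)
            backend.unregister(uuid)
        except InstanceNotFound:
            # Mirrors the WARNING above: nothing left on the backend, so the
            # instance is treated as already destroyed rather than erroring.
            pass
        return True  # gone either way; network cleanup continues

    destroy(FakeBackend(), "789a3358-bc70-44a5-bb2f-4fc2f1ff9116")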
[ 1128.239554] env[59379]: DEBUG oslo_concurrency.lockutils [None req-17d85dde-f013-475d-8fc4-eacad5043b2f tempest-ServerShowV254Test-1763719203 tempest-ServerShowV254Test-1763719203-project-member] Lock "789a3358-bc70-44a5-bb2f-4fc2f1ff9116" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.233s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1128.392663] env[59379]: DEBUG oslo_vmware.api [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Task: {'id': task-559676, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078169} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1128.392865] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1128.393014] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Deleted contents of the VM from datastore datastore2 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1128.393187] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1128.393348] env[59379]: INFO nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Took 0.60 seconds to destroy the instance on the hypervisor. 
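The DeleteDatastoreFile_Task records above show the usual oslo.vmware rhythm: submit a vCenter task, poll it ("progress is 0%"), and return once it reports success with a duration ('duration_secs': 0.078169). A generic poll-until-done loop in the same spirit is sketched below; this is not oslo.vmware's implementation, and the dict keys are assumptions made for the example.

    import time

    def wait_for_task(poll, interval=0.5, timeout=60.0):
        """Poll `poll()` until it reports success, error, or we time out."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = poll()  # e.g. {'state': 'running', 'progress': 0}
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise RuntimeError(info.get("error", "task failed"))
            time.sleep(interval)
        raise TimeoutError("task did not complete in time")

    # Tiny usage example with canned states:
    states = iter([{"state": "running", "progress": 0},
                   {"state": "success", "duration_secs": 0.078169}])
    print(wait_for_task(lambda: next(states), interval=0.01))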
[ 1128.395285] env[59379]: DEBUG nova.compute.claims [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1128.395448] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1128.395647] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1128.421260] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1128.421913] env[59379]: DEBUG nova.compute.utils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance 238825ed-3715-444c-be7c-f42f3884df7c could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1128.423331] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1128.423492] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1128.423645] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1128.423818] env[59379]: DEBUG nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1128.423970] env[59379]: DEBUG nova.network.neutron [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1128.541400] env[59379]: DEBUG neutronclient.v2_0.client [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59379) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1128.542939] env[59379]: ERROR nova.compute.manager [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] result = getattr(controller, method)(*args, **kwargs) [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._get(image_id) [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1128.542939] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] resp, body = self.http_client.get(url, headers=header) [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.request(url, 'GET', **kwargs) [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._handle_response(resp) [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exc.from_response(resp, resp.content) [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] During handling of the above exception, another exception occurred: [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.543324] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.driver.spawn(context, instance, image_meta, [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._fetch_image_if_missing(context, vi) [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image_fetch(context, vi, tmp_image_ds_loc) [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] images.fetch_image( [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 
238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] metadata = IMAGE_API.get(context, image_ref) [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1128.543658] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return session.show(context, image_id, [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] _reraise_translated_image_exception(image_id) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise new_exc.with_traceback(exc_trace) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] result = getattr(controller, method)(*args, **kwargs) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._get(image_id) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1128.544080] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] resp, body = self.http_client.get(url, headers=header) [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.request(url, 'GET', **kwargs) [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] 
return self._handle_response(resp) [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exc.from_response(resp, resp.content) [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] nova.exception.ImageNotAuthorized: Not authorized for image a816e082-61f0-4ffa-a214-1bf6bd197f53. [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] During handling of the above exception, another exception occurred: [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.544448] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._build_and_run_instance(context, instance, image, [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] with excutils.save_and_reraise_exception(): [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.force_reraise() [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise self.value [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] with self.rt.instance_claim(context, instance, node, allocs, [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.abort() [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1128.544803] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.tracker.abort_instance_claim(self.context, self.instance, [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return f(*args, **kwargs) [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._unset_instance_host_and_node(instance) [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] instance.save() [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] updates, result = self.indirection_api.object_action( [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return cctxt.call(context, 'object_action', objinst=objinst, [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1128.545182] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] result = self.transport._send( [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._driver.send(target, ctxt, message, [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise result [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] nova.exception_Remote.InstanceNotFound_Remote: Instance 238825ed-3715-444c-be7c-f42f3884df7c could not be found. 
[ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return getattr(target, method)(*args, **kwargs) [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545548] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return fn(self, *args, **kwargs) [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] old_ref, inst_ref = db.instance_update_and_get_original( [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return f(*args, **kwargs) [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] with excutils.save_and_reraise_exception() as ectxt: [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.force_reraise() [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.545967] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise self.value [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return f(*args, 
**kwargs) [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return f(context, *args, **kwargs) [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exception.InstanceNotFound(instance_id=uuid) [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546395] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] nova.exception.InstanceNotFound: Instance 238825ed-3715-444c-be7c-f42f3884df7c could not be found. [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] During handling of the above exception, another exception occurred: [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] exception_handler_v20(status_code, error_body) [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise client_exc(message=error_message, [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1128.546809] 
env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Neutron server returns request_ids: ['req-2afaad54-0e29-4690-99f0-84a1a97ad5fe'] [ 1128.546809] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] During handling of the above exception, another exception occurred: [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Traceback (most recent call last): [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._deallocate_network(context, instance, requested_networks) [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self.network_api.deallocate_for_instance( [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] data = neutron.list_ports(**search_opts) [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1128.547206] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.list('ports', self.ports_path, retrieve_all, [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] for r in self._pagination(collection, path, **params): [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] res = self.get(path, params=params) [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.retry_request("GET", action, body=body, [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.547603] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] return self.do_request(method, action, body=body, [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] ret = obj(*args, **kwargs) [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] self._handle_fault_response(status_code, replybody, resp) [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] raise exception.Unauthorized() [ 1128.547982] env[59379]: ERROR nova.compute.manager [instance: 238825ed-3715-444c-be7c-f42f3884df7c] nova.exception.Unauthorized: Not authorized. 
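This final chain is the second half of the same auth failure: by the time cleanup runs (the build lock for 238825ed was held for 461 seconds, so the request's credentials have most likely expired), neutron answers list_ports with a 401, and the wrapper in nova/network/neutron.py (lines 196/204 in the frames above) converts neutronclient's Unauthorized into nova.exception.Unauthorized. A hedged sketch of that wrapper idiom, with stand-in exception classes rather than the real nova/neutronclient types:

    import functools

    class ClientUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    def translate_unauthorized(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except ClientUnauthorized:
                # The token was rejected (HTTP 401); surface it in nova's terms.
                raise Unauthorized()
        return wrapper

    @translate_unauthorized
    def list_ports(**search_opts):
        raise ClientUnauthorized(
            "401: The request you have made requires authentication.")

Calling list_ports(device_id="238825ed-3715-444c-be7c-f42f3884df7c") raises Unauthorized, matching "Failed to deallocate networks: nova.exception.Unauthorized: Not authorized." in the record above.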
[ 1128.562789] env[59379]: DEBUG oslo_concurrency.lockutils [None req-53e9ea92-0458-4457-a018-7443031d5b85 tempest-ServerActionsTestOtherA-889365662 tempest-ServerActionsTestOtherA-889365662-project-member] Lock "238825ed-3715-444c-be7c-f42f3884df7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 461.046s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1135.427733] env[59379]: DEBUG nova.compute.manager [req-60a3183a-12db-4c42-b154-aae0d612f3cf req-a2aa3273-5bb0-4e62-a5f5-8317667ea052 service nova] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Received event network-vif-deleted-4ca1c8c8-9412-4f33-85e0-c657fee7af8c {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 1140.435094] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1140.435094] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Cleaning up deleted instances {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}}
[ 1140.463257] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] There are 8 instances to clean {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}}
[ 1140.463447] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.487214] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.535944] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 06d5ac6a-7734-46e3-80c5-d960821b7552] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.557055] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 13aee471-4813-4376-a7bf-70f266d9a399] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.578134] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 05010bc2-c30a-49bf-8daa-3eec6a5e9022] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.599188] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 03742e11-0fb2-48e2-9093-77ea7b647bf3] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.620235] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: a6ff207e-a925-46d1-9aaf-e06268d3c6f2] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1140.639582] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 238825ed-3715-444c-be7c-f42f3884df7c] Instance has had 0 of 5 cleanup attempts {{(pid=59379) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1144.434528] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1144.434822] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1144.434940] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Cleaning up deleted instances with incomplete migration {{(pid=59379) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}}
[ 1145.444309] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1146.433875] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1146.434066] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}}
[ 1146.434173] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1146.442889] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}}
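Annotation: the "Running periodic task ComputeManager._*" entries above are emitted by oslo.service's periodic task machinery. A hedged sketch of how such a task is declared and driven, assuming oslo.service and oslo.config are installed (the manager and task names here are illustrative, not Nova's real classes):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class MiniManager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=10)
        def _run_pending_deletes(self, context):
            # Nova's real task walks soft-deleted instances and retries
            # cleanup a bounded number of times (5 attempts in this log).
            print("cleaning up deleted instances")

    manager = MiniManager(CONF)
    # A service normally calls this from a timer loop; each call runs (and
    # logs) any registered task whose spacing has elapsed.
    manager.run_periodic_tasks(context=None)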
[ 1146.442889] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1146.789030] env[59379]: WARNING oslo_vmware.rw_handles [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles     response.begin()
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1146.789030] env[59379]: ERROR oslo_vmware.rw_handles
[ 1146.789030] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1146.789782] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1146.790153] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Copying Virtual Disk [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/b32647e5-a374-41c9-b52a-90eecbd25466/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1146.790559] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a4ac026f-c53d-46aa-b0da-082b19e430f0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
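Annotation: the WARNING above is the image-upload writer's close() calling getresponse() after the ESX host has already dropped the connection; the data was fully written, so the handler logs and continues. A minimal stdlib sketch of tolerating that failure mode on close (illustrative; not oslo.vmware's actual handler):

    import http.client

    def close_write_handle(conn: http.client.HTTPConnection):
        """Close an upload connection, tolerating a server-side hangup."""
        try:
            resp = conn.getresponse()   # rw_handles.py:283 in the traceback
            resp.read()
        except http.client.RemoteDisconnected:
            # The payload was already flushed; a dropped connection here is
            # logged as a warning rather than failing the image transfer.
            pass
        finally:
            conn.close()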
[ 1146.800486] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Waiting for the task: (returnval){
[ 1146.800486] env[59379]: value = "task-559677"
[ 1146.800486] env[59379]: _type = "Task"
[ 1146.800486] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1146.807907] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Task: {'id': task-559677, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1147.311017] env[59379]: DEBUG oslo_vmware.exceptions [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1147.311230] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1147.311763] env[59379]: ERROR nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1147.311763] env[59379]: Faults: ['InvalidArgument']
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Traceback (most recent call last):
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     yield resources
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     self.driver.spawn(context, instance, image_meta,
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     self._fetch_image_if_missing(context, vi)
[ 1147.311763] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     image_cache(vi, tmp_image_ds_loc)
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     vm_util.copy_virtual_disk(
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     session._wait_for_task(vmdk_copy_task)
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     return self.wait_for_task(task_ref)
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     return evt.wait()
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     result = hub.switch()
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1147.312142] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     return self.greenlet.switch()
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     self.f(*self.args, **self.kw)
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]     raise exceptions.translate_fault(task_info.error)
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Faults: ['InvalidArgument']
[ 1147.312598] env[59379]: ERROR nova.compute.manager [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c]
[ 1147.312598] env[59379]: INFO nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Terminating instance
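Annotation: the "Waiting for the task ... to complete" / "progress is 0%" records and the `_poll_task` frame above show the submit-then-poll pattern used for every vCenter task: poll the task object until it reaches success or error, and translate the VIM fault name (here InvalidArgument) into an exception. A hedged, self-contained sketch of that loop with stand-in types (the real one lives in oslo_vmware.api):

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:
        state: str              # "running" | "success" | "error"
        progress: int = 0
        error: str | None = None

    class TaskFailed(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(poll, interval=0.5):
        """Poll `poll()` until the task finishes; raise on error states."""
        while True:
            info = poll()
            if info.state == "success":
                return info
            if info.state == "error":
                # oslo.vmware maps the fault name to an exception class here.
                raise TaskFailed(info.error)
            print(f"progress is {info.progress}%")
            time.sleep(interval)

    # Usage: simulate the CopyVirtualDisk_Task above failing with
    # "A specified parameter was not correct: fileType".
    states = iter([TaskInfo("running", 0),
                   TaskInfo("error", error="InvalidArgument: fileType")])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except TaskFailed as exc:
        print("task failed:", exc)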
[ 1147.313595] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1147.313792] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1147.314412] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1147.314588] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1147.314799] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-29950af1-bb42-4dbd-a4c4-34ade37d5cc4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.317009] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-381720a5-8ec5-4235-a1a6-1d005b79a4da {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.323950] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1147.324156] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d10df083-83c5-4f40-93e8-fc0374ccbdb3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.326247] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1147.326408] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1147.327319] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d17ffd25-9ef2-4227-8f39-37426f0301fc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.331820] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Waiting for the task: (returnval){
[ 1147.331820] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]52cdf51a-1a56-af7a-bfed-c6fcc32b9edc"
[ 1147.331820] env[59379]: _type = "Task"
[ 1147.331820] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1147.338774] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]52cdf51a-1a56-af7a-bfed-c6fcc32b9edc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1147.409645] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1147.409801] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1147.410089] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Deleting the datastore file [datastore1] 2e622c9d-369c-4c36-a477-3237bea4cf7c {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1147.410475] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7b95ba7a-be72-45b6-ba35-166604bfbf54 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.416772] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Waiting for the task: (returnval){
[ 1147.416772] env[59379]: value = "task-559679"
[ 1147.416772] env[59379]: _type = "Task"
[ 1147.416772] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1147.424676] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Task: {'id': task-559679, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1147.433192] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1147.433396] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 1147.843265] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1147.843547] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Creating directory with path [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1147.843547] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80a707c8-9d6f-40de-b8de-cef070cc253d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.854893] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Created directory with path [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1147.855097] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Fetch image to [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1147.855263] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1147.855985] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0507be60-13d2-4452-9219-1db2f6b68863 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.862951] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8522051-af4a-401c-b1d6-7024210f8630 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.872896] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1861fdf2-5e23-4abe-a565-6573f7d43c7d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.905987] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d39057a-1113-4c87-95d9-c21e949a43eb {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.912018] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b3066be3-9290-4a86-bed8-7e8f9546c835 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1147.925718] env[59379]: DEBUG oslo_vmware.api [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Task: {'id': task-559679, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069427} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1147.925939] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1147.926126] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1147.926293] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1147.926461] env[59379]: INFO nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Took 0.61 seconds to destroy the instance on the hypervisor.
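Annotation: paths like "[datastore1] vmware_temp/.../tmp-sparse.vmdk" throughout this section follow vSphere's "[datastore-name] relative/path" convention. A tiny hypothetical helper showing just the string format (oslo.vmware ships its own DatastorePath class for this; the function below is only an illustration):

    import posixpath

    def ds_path(datastore: str, *parts: str) -> str:
        """Build a vSphere datastore path like '[datastore1] a/b/c.vmdk'."""
        return f"[{datastore}] {posixpath.join(*parts)}"

    print(ds_path("datastore1",
                  "vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb",
                  "a816e082-61f0-4ffa-a214-1bf6bd197f53",
                  "tmp-sparse.vmdk"))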
[ 1147.928630] env[59379]: DEBUG nova.compute.claims [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1147.928755] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1147.928963] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1147.934144] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1147.953123] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1147.953808] env[59379]: DEBUG nova.compute.utils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance 2e622c9d-369c-4c36-a477-3237bea4cf7c could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1147.958393] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1147.958596] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1147.958700] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1147.958853] env[59379]: DEBUG nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1147.959016] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1147.981980] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1148.032917] env[59379]: DEBUG nova.network.neutron [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1148.037539] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1148.037539] env[59379]: DEBUG oslo_vmware.rw_handles [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1148.043575] env[59379]: INFO nova.compute.manager [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] [instance: 2e622c9d-369c-4c36-a477-3237bea4cf7c] Took 0.08 seconds to deallocate network for instance.
[ 1148.159224] env[59379]: DEBUG oslo_concurrency.lockutils [None req-768b9f98-0bb2-48e7-a254-2f2ed5cacaeb tempest-ServerActionsTestJSON-1462263913 tempest-ServerActionsTestJSON-1462263913-project-member] Lock "2e622c9d-369c-4c36-a477-3237bea4cf7c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.463s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1148.433091] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1148.433338] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1148.443506] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1148.443655] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1148.443846] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1148.444016] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1148.445101] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c521ffe9-1180-4c01-8953-afe099c8ad02 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.454102] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b90ec975-1104-4e55-b231-ae50adcda157 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.469372] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffea9935-b261-4e8a-9b86-8b4631ffcd0a {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.475540] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8126fb-4921-4b2f-aa46-e8e718d7007c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.504919] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181694MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1148.505098] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1148.505300] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1148.538890] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1148.539082] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1148.560937] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8deb8b-7578-41a6-968c-4c486dc7e042 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.568572] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b17a7ffd-3011-4e4a-a237-edf7b7469c61 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.598402] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-543aac13-4923-46dc-9980-aa9304d1a782 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.605669] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c037b2d-0cb6-40ce-8eba-84f835288e79 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1148.620498] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1148.630438] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
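Annotation: the inventory record above is what Placement schedules against; schedulable capacity per resource class is (total - reserved) * allocation_ratio. A quick check against the numbers in the log:

    # Capacity formula behind the inventory dict logged above.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g} schedulable")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 -- which is why 48 physical
    # vCPUs can back far more than 48 single-vCPU guests.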
[ 1148.644971] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1148.645605] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1148.645605] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1149.436008] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1149.436318] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1149.445487] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Starting instance... {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1149.490103] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1149.490358] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1149.491808] env[59379]: INFO nova.compute.claims [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1149.560679] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bffe581b-5bc2-4cc5-8f25-1d1ffe7f5c4b {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1149.568528] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed911522-5c96-4944-8836-dc15bc75577d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1149.597711] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b308217-4cf6-4799-8f24-04bc107600e4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1149.604678] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d10d185-c542-4751-bedd-f0c3e107ad08 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1149.619013] env[59379]: DEBUG nova.compute.provider_tree [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1149.628359] env[59379]: DEBUG nova.scheduler.client.report [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1149.643289] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1149.643753] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Start building networks asynchronously for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1149.656337] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1149.656518] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1149.680613] env[59379]: DEBUG nova.compute.utils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Using /dev/sd instead of None {{(pid=59379) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1149.681751] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Allocating IP information in the background. {{(pid=59379) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1149.681916] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] allocate_for_instance() {{(pid=59379) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1149.691791] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Start building block device mappings for instance. {{(pid=59379) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1149.737698] env[59379]: DEBUG nova.policy [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd469ed909114450e86057d08dd15d305', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd4ef55cbd57248dbb887968a4efde03b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59379) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1149.752548] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Start spawning the instance on the hypervisor. {{(pid=59379) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1149.774982] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-08-31T09:43:23Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-08-31T09:43:05Z,direct_url=<?>,disk_format='vmdk',id=a816e082-61f0-4ffa-a214-1bf6bd197f53,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='e8722a37ef8f4279abfe709d29d7d3ca',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-08-31T09:43:06Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1149.775268] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Flavor limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1149.775468] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Image limits 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1149.775686] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Flavor pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1149.775852] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Image pref 0:0:0 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1149.776033] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59379) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1149.776274] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1149.776460] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1149.776631] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Got 1 possible topologies {{(pid=59379) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1149.776796] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1149.776984] env[59379]: DEBUG nova.virt.hardware [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59379) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1149.777810] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2db6f520-e1c4-466e-b3c7-c8bf6427377d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1149.785488] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-701adeae-9d99-4de2-b79d-454397711edc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1150.027368] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Successfully created port: e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1150.429021] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1150.489393] env[59379]: DEBUG nova.compute.manager [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Received event network-vif-plugged-e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
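Annotation: the nova.virt.hardware lines above enumerate every valid sockets:cores:threads split of the flavor's vCPU count under the (effectively unbounded) 65536 limits; for a 1-vCPU m1.nano only 1:1:1 survives, hence "Got 1 possible topologies". A hedged sketch of that enumeration idea (illustrative, not nova.virt.hardware's exact algorithm):

    from itertools import product

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield (sockets, cores, threads) splits whose product is vcpus."""
        for s, c, t in product(range(1, min(vcpus, max_sockets) + 1),
                               range(1, min(vcpus, max_cores) + 1),
                               range(1, min(vcpus, max_threads) + 1)):
            if s * c * t == vcpus:
                yield (s, c, t)

    print(list(possible_topologies(1)))   # [(1, 1, 1)] -- matches the log
    print(list(possible_topologies(4)))   # several splits for a 4-vCPU flavor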
[instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Received event network-vif-plugged-e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1150.489639] env[59379]: DEBUG oslo_concurrency.lockutils [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] Acquiring lock "3c337acb-ce39-44b5-a898-b40d7f4d5234-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1150.489791] env[59379]: DEBUG oslo_concurrency.lockutils [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1150.489944] env[59379]: DEBUG oslo_concurrency.lockutils [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1150.490172] env[59379]: DEBUG nova.compute.manager [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] No waiting events found dispatching network-vif-plugged-e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1150.490334] env[59379]: WARNING nova.compute.manager [req-c2501795-db68-4043-9557-2fbc8a8f3e23 req-806fd989-47f5-4a57-a8c6-60394955c76e service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Received unexpected event network-vif-plugged-e6e3ca6a-5321-4077-b804-a611dadeed5f for instance with vm_state building and task_state spawning. 
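(The get_cpu_topology_constraints / _get_possible_cpu_topologies records above show Nova enumerating every sockets:cores:threads factorisation of the flavor's vCPU count before sorting the candidates by preference. A minimal illustrative sketch of that enumeration follows; it is not Nova's actual implementation, and VirtCPUTopology here is a stand-in namedtuple rather than nova.objects.VirtCPUTopology.)

from collections import namedtuple

VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

def possible_topologies(vcpus, maximum):
    """Yield each sockets/cores/threads split whose product equals vcpus
    and which respects the per-dimension maxima (cf. 'limits were
    sockets=65536, cores=65536, threads=65536' above)."""
    for sockets in range(1, min(vcpus, maximum.sockets) + 1):
        if vcpus % sockets:
            continue
        for cores in range(1, min(vcpus // sockets, maximum.cores) + 1):
            if (vcpus // sockets) % cores:
                continue
            threads = vcpus // (sockets * cores)
            if threads <= maximum.threads:
                yield VirtCPUTopology(sockets, cores, threads)

# For the m1.nano flavor traced here (1 vCPU), the only factorisation is
# 1:1:1, matching the single 'Possible topologies
# [VirtCPUTopology(cores=1,sockets=1,threads=1)]' reported in the log.
limits = VirtCPUTopology(sockets=65536, cores=65536, threads=65536)
print(list(possible_topologies(1, limits)))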
[ 1150.561782] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Successfully updated port: e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1150.570524] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1150.570630] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquired lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1150.570772] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Building network info cache for instance {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1150.607625] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance cache missing network info. 
{{(pid=59379) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1150.750771] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Updating instance_info_cache with network_info: [{"id": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "address": "fa:16:3e:01:ca:59", "network": {"id": "574aa782-18f9-49f4-8f82-dace0de59cfc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1813753841-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4ef55cbd57248dbb887968a4efde03b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6e3ca6a-53", "ovs_interfaceid": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1150.763544] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Releasing lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1150.763830] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance network_info: |[{"id": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "address": "fa:16:3e:01:ca:59", "network": {"id": "574aa782-18f9-49f4-8f82-dace0de59cfc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1813753841-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4ef55cbd57248dbb887968a4efde03b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6e3ca6a-53", "ovs_interfaceid": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59379) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1150.764201] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:01:ca:59', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '75ff81f9-72b2-4e58-a8d8-5699907f7459', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e6e3ca6a-5321-4077-b804-a611dadeed5f', 'vif_model': 'vmxnet3'}] {{(pid=59379) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1150.772171] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Creating folder: Project (d4ef55cbd57248dbb887968a4efde03b). Parent ref: group-v140509. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1150.772656] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-36ff519a-3c6d-4d5b-95ed-313e81fcb74e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1150.782743] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Created folder: Project (d4ef55cbd57248dbb887968a4efde03b) in parent group-v140509. [ 1150.782910] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Creating folder: Instances. Parent ref: group-v140586. {{(pid=59379) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1150.783123] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c16842b6-d099-45cf-8cc4-640fb345d763 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1150.791276] env[59379]: INFO nova.virt.vmwareapi.vm_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Created folder: Instances in parent group-v140586. [ 1150.791471] env[59379]: DEBUG oslo.service.loopingcall [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59379) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1150.791628] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Creating VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1150.791790] env[59379]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eac90959-2695-40c2-9375-e4973a062548 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1150.809865] env[59379]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1150.809865] env[59379]: value = "task-559682" [ 1150.809865] env[59379]: _type = "Task" [ 1150.809865] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1150.816808] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559682, 'name': CreateVM_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1151.319291] env[59379]: DEBUG oslo_vmware.api [-] Task: {'id': task-559682, 'name': CreateVM_Task, 'duration_secs': 0.307324} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1151.319465] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Created VM on the ESX host {{(pid=59379) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1151.320212] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1151.320420] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1151.320739] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1151.320976] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3415ef71-838d-4721-88bf-8ebc8e5e0ffd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1151.325063] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Waiting for the task: (returnval){ [ 1151.325063] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]526872e3-5f67-db16-1320-8b9f25cbbd8a" [ 1151.325063] env[59379]: _type = "Task" [ 1151.325063] env[59379]: } to complete. 
{{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1151.332041] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]526872e3-5f67-db16-1320-8b9f25cbbd8a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1151.834944] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1151.835309] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Processing image a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1151.835401] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1152.520056] env[59379]: DEBUG nova.compute.manager [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Received event network-changed-e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1152.520397] env[59379]: DEBUG nova.compute.manager [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Refreshing instance network info cache due to event network-changed-e6e3ca6a-5321-4077-b804-a611dadeed5f. 
{{(pid=59379) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1152.520514] env[59379]: DEBUG oslo_concurrency.lockutils [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] Acquiring lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1152.520660] env[59379]: DEBUG oslo_concurrency.lockutils [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] Acquired lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1152.520887] env[59379]: DEBUG nova.network.neutron [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Refreshing network info cache for port e6e3ca6a-5321-4077-b804-a611dadeed5f {{(pid=59379) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1152.787760] env[59379]: DEBUG nova.network.neutron [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Updated VIF entry in instance network info cache for port e6e3ca6a-5321-4077-b804-a611dadeed5f. {{(pid=59379) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1152.788185] env[59379]: DEBUG nova.network.neutron [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Updating instance_info_cache with network_info: [{"id": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "address": "fa:16:3e:01:ca:59", "network": {"id": "574aa782-18f9-49f4-8f82-dace0de59cfc", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1813753841-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d4ef55cbd57248dbb887968a4efde03b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "75ff81f9-72b2-4e58-a8d8-5699907f7459", "external-id": "nsx-vlan-transportzone-978", "segmentation_id": 978, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6e3ca6a-53", "ovs_interfaceid": "e6e3ca6a-5321-4077-b804-a611dadeed5f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1152.797483] env[59379]: DEBUG oslo_concurrency.lockutils [req-2e914fb3-74de-45de-9217-0def7ea03297 req-ad59e9ed-92b8-4027-b913-8230563494da service nova] Releasing lock "refresh_cache-3c337acb-ce39-44b5-a898-b40d7f4d5234" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1193.794645] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59379) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1193.805406] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Getting list of instances from cluster (obj){ [ 1193.805406] env[59379]: value = "domain-c8" [ 1193.805406] env[59379]: _type = "ClusterComputeResource" [ 1193.805406] env[59379]: } {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1193.806439] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa4bb36f-1721-4b73-9576-b0245b55f6e7 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.817141] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Got total of 2 instances {{(pid=59379) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1193.817298] env[59379]: WARNING nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] While synchronizing instance power states, found 1 instances in the database and 2 instances on the hypervisor. [ 1193.817435] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Triggering sync for uuid 3c337acb-ce39-44b5-a898-b40d7f4d5234 {{(pid=59379) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 1193.817734] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1194.754335] env[59379]: WARNING oslo_vmware.rw_handles [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles response.begin() [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1194.754335] env[59379]: ERROR oslo_vmware.rw_handles [ 1194.754855] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] 
Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1194.756662] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1194.756916] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Copying Virtual Disk [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/3b7c39c4-d9c9-43c3-afb0-629acc1f1deb/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1194.757216] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e92f28f5-7d2c-4cd8-a09d-b4229cced496 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.765022] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Waiting for the task: (returnval){ [ 1194.765022] env[59379]: value = "task-559683" [ 1194.765022] env[59379]: _type = "Task" [ 1194.765022] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1194.772455] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Task: {'id': task-559683, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1195.274856] env[59379]: DEBUG oslo_vmware.exceptions [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Fault InvalidArgument not matched. 
{{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1195.275258] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1195.275630] env[59379]: ERROR nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.275630] env[59379]: Faults: ['InvalidArgument'] [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Traceback (most recent call last): [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] yield resources [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] self.driver.spawn(context, instance, image_meta, [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] self._fetch_image_if_missing(context, vi) [ 1195.275630] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] image_cache(vi, tmp_image_ds_loc) [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] vm_util.copy_virtual_disk( [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] session._wait_for_task(vmdk_copy_task) [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] return self.wait_for_task(task_ref) [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] return evt.wait() [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] result = hub.switch() [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1195.275981] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] return self.greenlet.switch() [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] self.f(*self.args, **self.kw) [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] raise exceptions.translate_fault(task_info.error) [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Faults: ['InvalidArgument'] [ 1195.276503] env[59379]: ERROR nova.compute.manager [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] [ 1195.276503] env[59379]: INFO nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Terminating instance [ 1195.278329] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquired lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1195.278329] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1195.278329] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-824426e5-c10d-442d-9a2d-597bc06133b9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.280575] 
env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1195.280751] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1195.281481] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ef9351-ce2e-4915-bf18-d837788c0ab5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.288120] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1195.288324] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ee62b554-0d0e-4f92-a5f6-c2094cbdc607 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.290523] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1195.290685] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59379) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1195.291596] env[59379]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0c6458bc-cde8-4523-836f-26ef97f79f6d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.296156] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Waiting for the task: (returnval){ [ 1195.296156] env[59379]: value = "session[52bab112-692d-5758-5f79-71c0094a80bb]525aab5b-0b95-139e-2bab-3873b184afc2" [ 1195.296156] env[59379]: _type = "Task" [ 1195.296156] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1195.304789] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Task: {'id': session[52bab112-692d-5758-5f79-71c0094a80bb]525aab5b-0b95-139e-2bab-3873b184afc2, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1195.355885] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1195.356206] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1195.356470] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Deleting the datastore file [datastore1] 9745bc90-6927-46a9-af48-df69046dc2a2 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1195.356759] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4a98ff99-b673-4def-8343-15dde5022b0e {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.362775] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Waiting for the task: (returnval){ [ 1195.362775] env[59379]: value = "task-559685" [ 1195.362775] env[59379]: _type = "Task" [ 1195.362775] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1195.370166] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Task: {'id': task-559685, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1195.806417] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Preparing fetch location {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1195.806665] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Creating directory with path [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1195.806851] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e98fb616-cb17-41ae-b19c-f7851e9ed6dd {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.817487] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Created directory with path [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53 {{(pid=59379) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1195.817654] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Fetch image to [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1195.817815] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1195.818533] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c93124cb-260b-4a37-8e77-638ac45db269 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.824828] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81304a5f-d136-455d-99b1-340fb8951ec9 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.833427] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1788ff1-45b3-4a80-bdb9-ed24b1a4e206 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.863266] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ac6aecb-8602-46b9-aab7-a434cc0950ce {{(pid=59379) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.873704] env[59379]: DEBUG oslo_vmware.api [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Task: {'id': task-559685, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072486} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1195.873877] env[59379]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3a83fb58-a0f1-4a37-ada8-9d0d34043652 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.875442] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1195.875620] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1195.875747] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1195.875914] env[59379]: INFO nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Took 0.60 seconds to destroy the instance on the hypervisor. 
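(The CreateVM_Task, SearchDatastore_Task and DeleteDatastoreFile_Task records above all follow one oslo.vmware pattern: invoke_api() issues the SOAP call and returns a task reference, and wait_for_task() polls it, which is what produces the recurring 'progress is 0%' _poll_task lines, raising a translated fault if the task errors. A hedged sketch of that pattern; the host, credentials and datastore path are placeholders, and the constructor argument names are assumed from oslo.vmware's public API rather than taken from this deployment.)

from oslo_vmware import api

def delete_datastore_file(host, user, password, path):
    # VMwareAPISession logs into vCenter on construction, as in the
    # _create_session records at the start of this log.
    session = api.VMwareAPISession(host, user, password,
                                   api_retry_count=10,
                                   task_poll_interval=0.5)
    file_manager = session.vim.service_content.fileManager
    # invoke_api() corresponds to the 'Invoking
    # FileManager.DeleteDatastoreFile_Task' records; a production call
    # would also pass the Datacenter managed object via datacenter=.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager, name=path)
    # wait_for_task() blocks, polling the task like _poll_task above, and
    # raises e.g. VimFaultException if the task reports an error.
    return session.wait_for_task(task)

# Hypothetical usage against a placeholder vCenter:
# delete_datastore_file('vc.example.test', 'admin', 'secret',
#                       '[datastore1] 9745bc90-6927-46a9-af48-df69046dc2a2')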
[ 1195.878507] env[59379]: DEBUG nova.compute.claims [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1195.878670] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1195.878870] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1195.895718] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Downloading image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1195.905017] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1195.905710] env[59379]: DEBUG nova.compute.utils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance 9745bc90-6927-46a9-af48-df69046dc2a2 could not be found. {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1195.907359] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Instance disappeared during build. {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1195.907521] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1195.907675] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1195.907900] env[59379]: DEBUG nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1195.907984] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1195.941015] env[59379]: DEBUG oslo_vmware.rw_handles [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1195.990956] env[59379]: DEBUG nova.network.neutron [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1195.994716] env[59379]: DEBUG oslo_vmware.rw_handles [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Completed reading data from the image iterator. {{(pid=59379) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1195.994868] env[59379]: DEBUG oslo_vmware.rw_handles [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59379) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1196.000266] env[59379]: INFO nova.compute.manager [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] [instance: 9745bc90-6927-46a9-af48-df69046dc2a2] Took 0.09 seconds to deallocate network for instance. 
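(The oslo_vmware.rw_handles WARNING earlier in this log, http.client.RemoteDisconnected raised from close(), occurs because close() calls getresponse() on a connection the ESX host has already dropped once the upload body is complete; the handler logs the traceback and continues, which is why the next record still reports the image file as downloaded. A stdlib-only sketch of that tolerant-close behaviour; the function is illustrative, not oslo.vmware's actual code.)

import http.client

def tolerant_close(conn: http.client.HTTPConnection) -> None:
    """Read the (possibly absent) response, then close the connection."""
    try:
        # May raise RemoteDisconnected if the peer hung up without
        # sending a response, exactly as in the traceback above.
        conn.getresponse()
    except http.client.RemoteDisconnected:
        # The upload body was already sent, so treat the dropped
        # connection as non-fatal and merely fall through to close().
        pass
    finally:
        conn.close()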
[ 1196.065884] env[59379]: DEBUG oslo_concurrency.lockutils [None req-9e8f10d9-f5cc-43e3-a33e-fb17a0ba8a4a tempest-ImagesOneServerTestJSON-224376027 tempest-ImagesOneServerTestJSON-224376027-project-member] Lock "9745bc90-6927-46a9-af48-df69046dc2a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 257.441s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1205.452874] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1207.433410] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1207.433772] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Starting heal instance info cache {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1207.433772] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Rebuilding the list of instances to heal {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1207.444363] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Skipping network cache update for instance because it is Building. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1207.444499] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Didn't find any instances for network info cache update. {{(pid=59379) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1207.444928] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1207.445114] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1207.445289] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1207.445420] env[59379]: DEBUG nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59379) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}}
[ 1208.433705] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1209.433519] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1209.443703] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1209.444097] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1209.444097] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1209.444276] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59379) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1209.445271] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-662234c9-4d09-4731-8521-23f47682aa55 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1209.453577] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb8ca701-a1ba-4126-9971-5bd36eb2324c {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1209.466721] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42e0b0b9-d261-4dee-8495-c036164d55c4 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1209.472498] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d023c08-c29e-4cff-8635-f712e3ad86bc {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1209.500466] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181771MB free_disk=101GB free_vcpus=48 pci_devices=None {{(pid=59379) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1209.500617] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1209.500765] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1209.536394] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Instance 3c337acb-ce39-44b5-a898-b40d7f4d5234 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59379) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1209.536595] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1209.536739] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59379) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1209.563598] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b07d7c49-86bd-4c94-ac96-49cca6da9416 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1209.570526] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8442d10e-c4f6-43d4-a5ec-0c56fba8f926 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1210.228801] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fecd04b-198c-44dc-99d5-8f5dcdcb2cdf {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1210.235896] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7bb50ba-5ab9-43b8-9746-b6818ffd41e5 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1210.248477] env[59379]: DEBUG nova.compute.provider_tree [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1210.256305] env[59379]: DEBUG nova.scheduler.client.report [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1210.268777] env[59379]: DEBUG nova.compute.resource_tracker [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59379) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1210.268950] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.768s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1211.270217] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1211.270616] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1241.477400] env[59379]: WARNING oslo_vmware.rw_handles [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles response.begin()
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1241.477400] env[59379]: ERROR oslo_vmware.rw_handles
[ 1241.478181] env[59379]: DEBUG nova.virt.vmwareapi.images [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Downloaded image file data a816e082-61f0-4ffa-a214-1bf6bd197f53 to vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk on the data store datastore1 {{(pid=59379) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1241.480068] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Caching image {{(pid=59379) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1241.480357] env[59379]: DEBUG nova.virt.vmwareapi.vm_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Copying Virtual Disk [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/tmp-sparse.vmdk to [datastore1] vmware_temp/16cc1936-9e00-4469-8b4b-440730b29025/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk {{(pid=59379) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1241.480628] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-459e3a05-f05b-4270-823b-c055a3db302d {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1241.489537] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Waiting for the task: (returnval){
[ 1241.489537] env[59379]: value = "task-559686"
[ 1241.489537] env[59379]: _type = "Task"
[ 1241.489537] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1241.497226] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Task: {'id': task-559686, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1241.999601] env[59379]: DEBUG oslo_vmware.exceptions [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Fault InvalidArgument not matched. {{(pid=59379) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
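[Editor's note] The "Waiting for the task" / "progress is 0%" pairs above come from oslo.vmware polling the CopyVirtualDisk_Task managed object (_poll_task in oslo_vmware/api.py, driven by a looping call). A minimal sketch of that poll loop's shape; get_info is a caller-supplied stand-in for the PropertyCollector read of Task.info that the library actually performs, not a real oslo.vmware function:

```python
import time

class TaskFailed(Exception):
    """Raised when the task reaches the 'error' state."""

def wait_for_task(task_ref, get_info, poll_interval=0.5):
    # get_info(task_ref) must return an object with .state, .progress,
    # .result and (on failure) .error -- a stand-in for reading Task.info.
    while True:
        info = get_info(task_ref)
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # oslo.vmware instead raises exceptions.translate_fault(info.error),
            # which is where the VimFaultException below originates.
            raise TaskFailed(info.error)
        print(f"Task: {task_ref} progress is {info.progress or 0}%.")
        time.sleep(poll_interval)
```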
[ 1241.999851] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Releasing lock "[datastore1] devstack-image-cache_base/a816e082-61f0-4ffa-a214-1bf6bd197f53/a816e082-61f0-4ffa-a214-1bf6bd197f53.vmdk" {{(pid=59379) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1242.000387] env[59379]: ERROR nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1242.000387] env[59379]: Faults: ['InvalidArgument']
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Traceback (most recent call last):
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] yield resources
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self.driver.spawn(context, instance, image_meta,
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self._fetch_image_if_missing(context, vi)
[ 1242.000387] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] image_cache(vi, tmp_image_ds_loc)
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] vm_util.copy_virtual_disk(
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] session._wait_for_task(vmdk_copy_task)
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return self.wait_for_task(task_ref)
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return evt.wait()
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] result = hub.switch()
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1242.000998] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return self.greenlet.switch()
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self.f(*self.args, **self.kw)
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] raise exceptions.translate_fault(task_info.error)
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Faults: ['InvalidArgument']
[ 1242.001457] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234]
[ 1242.001457] env[59379]: INFO nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Terminating instance
[ 1242.003365] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Start destroying the instance on the hypervisor. {{(pid=59379) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
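[Editor's note] Downstream callers see the spawn failure above as an oslo_vmware.exceptions.VimFaultException whose fault_list carries the raw vCenter fault names ('InvalidArgument' here). A minimal sketch of discriminating on it, assuming only the documented fault_list attribute; copy_disk is a hypothetical stand-in for the code path that issues CopyVirtualDisk_Task and waits on it:

```python
from oslo_vmware import exceptions as vexc

def copy_with_fault_check(copy_disk, *args, **kwargs):
    # copy_disk is hypothetical: any callable that runs the disk copy
    # task and waits on it, as the spawn path in the log above does.
    try:
        return copy_disk(*args, **kwargs)
    except vexc.VimFaultException as e:
        # fault_list holds the vCenter fault names, e.g. ['InvalidArgument']
        # for the fileType complaint logged above.
        if 'InvalidArgument' in (e.fault_list or []):
            raise ValueError(f"rejected disk copy spec: {e}") from e
        raise
```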
[ 1242.003544] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Destroying instance {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1242.004261] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ae70b44-cd31-4a27-98bc-33c82b0215b3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.010322] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Unregistering the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1242.010508] env[59379]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7b2e84ab-1c84-427c-bb0e-008a7cd4c653 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.084606] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Unregistered the VM {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1242.084818] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Deleting contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1242.084956] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Deleting the datastore file [datastore1] 3c337acb-ce39-44b5-a898-b40d7f4d5234 {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1242.085206] env[59379]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d64b792d-eb9e-4e3f-a8f8-dcf17be83f69 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.092954] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Waiting for the task: (returnval){
[ 1242.092954] env[59379]: value = "task-559688"
[ 1242.092954] env[59379]: _type = "Task"
[ 1242.092954] env[59379]: } to complete. {{(pid=59379) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1242.099969] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Task: {'id': task-559688, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1242.602637] env[59379]: DEBUG oslo_vmware.api [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Task: {'id': task-559688, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07565} completed successfully. {{(pid=59379) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1242.602991] env[59379]: DEBUG nova.virt.vmwareapi.ds_util [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Deleted the datastore file {{(pid=59379) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1242.603060] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Deleted contents of the VM from datastore datastore1 {{(pid=59379) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1242.603190] env[59379]: DEBUG nova.virt.vmwareapi.vmops [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Instance destroyed {{(pid=59379) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1242.603352] env[59379]: INFO nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Took 0.60 seconds to destroy the instance on the hypervisor.
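[Editor's note] The unregister-then-delete order above is the safe teardown sequence: UnregisterVM removes the vCenter inventory entry first, after which the instance directory can be deleted from the datastore. A minimal sketch of the delete step, assuming an established oslo.vmware VMwareAPISession (a simplified shape of what nova's ds_util.file_delete does, not the verbatim code):

```python
def delete_datastore_dir(session, ds_path, datacenter_ref):
    # FileManager lives on the service content of the vSphere API.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name=ds_path,  # e.g. '[datastore1] <instance uuid>'
                              datacenter=datacenter_ref)
    # Blocks until vCenter reports completion, raising on task error --
    # the same wait_for_task/_poll_task path visible in the log above.
    session.wait_for_task(task)
```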
[ 1242.605444] env[59379]: DEBUG nova.compute.claims [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Aborting claim: {{(pid=59379) abort /opt/stack/nova/nova/compute/claims.py:84}}
[ 1242.605607] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1242.605806] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1242.665021] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffcc0df9-78bd-4fbd-9f54-38d0097a41c0 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.671794] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a1991e-18a3-4827-9f66-338844db24df {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.700713] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f5d1a80-9c4a-42ab-980c-e86b0f87ada3 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.707396] env[59379]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18eee7fe-f966-4950-978a-698fea8ab8a1 {{(pid=59379) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1242.720106] env[59379]: DEBUG nova.compute.provider_tree [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Inventory has not changed in ProviderTree for provider: 693f1d2b-e627-44fb-bcd5-714cccac894b {{(pid=59379) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1242.728409] env[59379]: DEBUG nova.scheduler.client.report [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Inventory has not changed for provider 693f1d2b-e627-44fb-bcd5-714cccac894b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 101, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59379) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1242.741173] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.135s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1242.741677] env[59379]: ERROR nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1242.741677] env[59379]: Faults: ['InvalidArgument']
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Traceback (most recent call last):
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self.driver.spawn(context, instance, image_meta,
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self._fetch_image_if_missing(context, vi)
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] image_cache(vi, tmp_image_ds_loc)
[ 1242.741677] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] vm_util.copy_virtual_disk(
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] session._wait_for_task(vmdk_copy_task)
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return self.wait_for_task(task_ref)
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return evt.wait()
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] result = hub.switch()
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] return self.greenlet.switch()
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1242.742018] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] self.f(*self.args, **self.kw)
[ 1242.742499] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1242.742499] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] raise exceptions.translate_fault(task_info.error)
[ 1242.742499] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1242.742499] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Faults: ['InvalidArgument']
[ 1242.742499] env[59379]: ERROR nova.compute.manager [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234]
[ 1242.742499] env[59379]: DEBUG nova.compute.utils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] VimFaultException {{(pid=59379) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1242.743662] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Build of instance 3c337acb-ce39-44b5-a898-b40d7f4d5234 was re-scheduled: A specified parameter was not correct: fileType
[ 1242.743662] env[59379]: Faults: ['InvalidArgument'] {{(pid=59379) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1242.744071] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Unplugging VIFs for instance {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1242.744215] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59379) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
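[Editor's note] The "does not provide unplug_vifs method" line reflects a deliberate pattern: the base virt driver raises NotImplementedError and the cleanup path treats that as "cannot determine". A minimal sketch of that shape (simplified from memory, not verbatim Nova code; names are illustrative):

```python
class BaseDriver:
    def unplug_vifs(self, instance, network_info):
        # Base drivers opt out; subclasses that manage VIF plumbing override.
        raise NotImplementedError()

def cleanup_allocated_networks(driver, instance, network_info):
    try:
        driver.unplug_vifs(instance, network_info)
    except NotImplementedError:
        # Mirrors the debug message logged above: without the hook there
        # is no way to know whether VIFs need unplugging, so move on.
        pass
```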
[ 1242.744381] env[59379]: DEBUG nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Deallocating network for instance {{(pid=59379) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1242.744527] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] deallocate_for_instance() {{(pid=59379) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}}
[ 1242.961636] env[59379]: DEBUG nova.network.neutron [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Updating instance_info_cache with network_info: [] {{(pid=59379) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1242.973533] env[59379]: INFO nova.compute.manager [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] Took 0.23 seconds to deallocate network for instance.
[ 1243.060302] env[59379]: INFO nova.scheduler.client.report [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Deleted allocations for instance 3c337acb-ce39-44b5-a898-b40d7f4d5234
[ 1243.075937] env[59379]: DEBUG oslo_concurrency.lockutils [None req-2002e003-76ff-402d-b795-06d85bfc3029 tempest-AttachVolumeShelveTestJSON-824139339 tempest-AttachVolumeShelveTestJSON-824139339-project-member] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 93.640s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1243.076173] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 49.258s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1243.076352] env[59379]: INFO nova.compute.manager [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] [instance: 3c337acb-ce39-44b5-a898-b40d7f4d5234] During sync_power_state the instance has a pending task (spawning). Skip.
[ 1243.076512] env[59379]: DEBUG oslo_concurrency.lockutils [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Lock "3c337acb-ce39-44b5-a898-b40d7f4d5234" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59379) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1265.432475] env[59379]: DEBUG oslo_service.periodic_task [None req-e99e476b-f66b-4f24-9eb4-7b30126bd213 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59379) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
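[Editor's note] A closing note on the lock bookkeeping that threads through this excerpt: the Acquiring/acquired/"released" triplets with waited/held times are emitted by oslo.concurrency's synchronized wrapper (the `inner` frames at lockutils.py:404/409/423 above). The 49.258s wait logged just before this is simply the power-state sync blocked on the per-instance lock while the failed build held it for 93.640s. A minimal sketch of the same pattern, assuming the standard oslo.concurrency API (function name copied from the log for illustration):

```python
from oslo_concurrency import lockutils

@lockutils.synchronized('3c337acb-ce39-44b5-a898-b40d7f4d5234')
def query_driver_power_state_and_sync():
    # Body runs only once any other holder of this named lock releases
    # it; the decorator logs the waited/held durations seen above.
    ...
```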