[ 480.216408] env[59333]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 480.846136] env[59382]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code. [ 482.383536] env[59382]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59382) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}} [ 482.383861] env[59382]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59382) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}} [ 482.383977] env[59382]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59382) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}} [ 482.384302] env[59382]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs [ 482.385431] env[59382]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300 [ 482.506780] env[59382]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59382) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} [ 482.517088] env[59382]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59382) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} [ 482.618050] env[59382]: INFO nova.virt.driver [None req-c95dadf4-65e5-4a1a-8f78-6d8630e9db62 None None] Loading compute driver 'vmwareapi.VMwareVCDriver' [ 482.691674] env[59382]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 482.691847] env[59382]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 482.691947] env[59382]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59382) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}} [ 485.890754] env[59382]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-28053e14-dffc-4823-9f6b-f2ec8ed97a86 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.906572] env[59382]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59382) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}} [ 485.906744] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-653320db-0a98-4472-b745-6e4b9055091f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.940897] env[59382]: INFO oslo_vmware.api [-] Successfully established new session; session ID is b1640. 
[ 485.941028] env[59382]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.249s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 485.941855] env[59382]: INFO nova.virt.vmwareapi.driver [None req-c95dadf4-65e5-4a1a-8f78-6d8630e9db62 None None] VMware vCenter version: 7.0.3 [ 485.945614] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60704427-c9e3-4b00-a944-f068f09454dd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.963325] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50a8d89c-bc98-4354-aed8-b45c96ddff9c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.969187] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48af14a5-08b4-4e33-8d4c-265c648e21c4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.975668] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c3011ef-4916-4085-827b-76e9891bc92b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.989597] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d05b594-a4bb-4f84-9198-44146129555c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 485.995916] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4195e3c8-abb0-4f7f-bb3c-326314d3d466 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.025074] env[59382]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-210e6c21-2aff-4d75-92db-02c783395c46 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.030260] env[59382]: DEBUG nova.virt.vmwareapi.driver [None req-c95dadf4-65e5-4a1a-8f78-6d8630e9db62 None None] Extension org.openstack.compute already exists. {{(pid=59382) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}} [ 486.032918] env[59382]: INFO nova.compute.provider_config [None req-c95dadf4-65e5-4a1a-8f78-6d8630e9db62 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access. 
[ 486.049607] env[59382]: DEBUG nova.context [None req-c95dadf4-65e5-4a1a-8f78-6d8630e9db62 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),6010e81a-54d3-4bed-a271-4b450707ca55(cell1) {{(pid=59382) load_cells /opt/stack/nova/nova/context.py:464}} [ 486.051459] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 486.051683] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 486.052442] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 486.052784] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Acquiring lock "6010e81a-54d3-4bed-a271-4b450707ca55" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 486.052978] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Lock "6010e81a-54d3-4bed-a271-4b450707ca55" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 486.053943] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Lock "6010e81a-54d3-4bed-a271-4b450707ca55" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 486.066848] env[59382]: DEBUG oslo_db.sqlalchemy.engines [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59382) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 486.067255] env[59382]: DEBUG oslo_db.sqlalchemy.engines [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59382) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 486.074440] env[59382]: ERROR nova.db.main.api [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 486.074440] env[59382]: result = function(*args, **kwargs) [ 486.074440] env[59382]: File 
"/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 486.074440] env[59382]: return func(*args, **kwargs) [ 486.074440] env[59382]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 486.074440] env[59382]: result = fn(*args, **kwargs) [ 486.074440] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 486.074440] env[59382]: return f(*args, **kwargs) [ 486.074440] env[59382]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 486.074440] env[59382]: return db.service_get_minimum_version(context, binaries) [ 486.074440] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 486.074440] env[59382]: _check_db_access() [ 486.074440] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 486.074440] env[59382]: stacktrace = ''.join(traceback.format_stack()) [ 486.074440] env[59382]: [ 486.075767] env[59382]: ERROR nova.db.main.api [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 486.075767] env[59382]: result = function(*args, **kwargs) [ 486.075767] env[59382]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 486.075767] env[59382]: return func(*args, **kwargs) [ 486.075767] env[59382]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 486.075767] env[59382]: result = fn(*args, **kwargs) [ 486.075767] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 486.075767] env[59382]: return f(*args, **kwargs) [ 486.075767] env[59382]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 486.075767] env[59382]: return db.service_get_minimum_version(context, binaries) [ 486.075767] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 486.075767] env[59382]: _check_db_access() [ 486.075767] env[59382]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 486.075767] env[59382]: stacktrace = ''.join(traceback.format_stack()) [ 486.075767] env[59382]: [ 486.076387] env[59382]: WARNING nova.objects.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Failed to get minimum service version for cell 6010e81a-54d3-4bed-a271-4b450707ca55 [ 486.076387] env[59382]: WARNING nova.objects.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 486.076716] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Acquiring lock "singleton_lock" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 486.076877] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Acquired lock "singleton_lock" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 486.077139] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Releasing lock "singleton_lock" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 486.077474] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Full set of CONF: {{(pid=59382) _wait_for_exit_or_signal 
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 486.077617] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ******************************************************************************** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 486.077744] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] Configuration options gathered from: {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 486.077875] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 486.078085] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 486.078216] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ================================================================================ {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 486.078424] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] allow_resize_to_same_host = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.078595] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] arq_binding_timeout = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.078732] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] backdoor_port = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.078851] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] backdoor_socket = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079017] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] block_device_allocate_retries = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079214] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] block_device_allocate_retries_interval = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079393] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cert = self.pem {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079560] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079728] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
compute_monitors = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.079891] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] config_dir = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080073] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] config_drive_format = iso9660 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080214] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080379] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] config_source = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080545] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] console_host = devstack {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080710] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] control_exchange = nova {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.080871] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cpu_allocation_ratio = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081046] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] daemon = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081223] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] debug = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081381] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] default_access_ip_network_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081546] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] default_availability_zone = nova {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081700] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] default_ephemeral_format = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.081935] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082138] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] default_schedule_zone = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082313] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] disk_allocation_ratio = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082476] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] enable_new_services = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082653] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] enabled_apis = ['osapi_compute'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082816] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] enabled_ssl_apis = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.082979] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] flat_injected = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.083153] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] force_config_drive = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.083346] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] force_raw_images = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.083483] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] graceful_shutdown_timeout = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.083645] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] heal_instance_info_cache_interval = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.083858] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] host = cpu-1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084059] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084242] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] initial_disk_allocation_ratio = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084408] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] initial_ram_allocation_ratio = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084622] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084789] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_build_timeout = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.084953] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_delete_interval = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.085197] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_format = [instance: %(uuid)s] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.085389] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_name_template = instance-%08x {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.085555] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_usage_audit = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.085728] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_usage_audit_period = month {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.085896] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086076] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] instances_path = /opt/stack/data/nova/instances {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086255] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] internal_service_availability_zone = internal {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086416] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] key = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086578] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] live_migration_retry_count = 30 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086740] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_config_append = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.086905] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087075] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_dir = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087239] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087368] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_options = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087533] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_rotate_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087701] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_rotate_interval_type = days {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.087869] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] log_rotation_type = none {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088000] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088150] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088320] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088484] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088610] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088772] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] long_rpc_timeout = 1800 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.088932] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_concurrent_builds = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089103] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_concurrent_live_migrations = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089266] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_concurrent_snapshots = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089424] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_local_block_devices = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089579] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_logfile_count = 30 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089736] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] max_logfile_size_mb = 200 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.089893] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] maximum_instance_delete_attempts = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090071] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metadata_listen = 0.0.0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090243] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metadata_listen_port = 8775 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090410] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metadata_workers = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090569] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] migrate_max_retries = -1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090733] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] mkisofs_cmd = genisoimage {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.090936] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] my_block_storage_ip = 10.180.1.21 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091079] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] my_ip = 10.180.1.21 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091272] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] network_allocate_retries = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091456] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091621] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] osapi_compute_listen = 0.0.0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091782] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] osapi_compute_listen_port = 8774 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.091946] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] osapi_compute_unique_server_name_scope = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092129] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] osapi_compute_workers = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092294] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] password_length = 12 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092452] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] periodic_enable = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092611] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] periodic_fuzzy_delay = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092775] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] pointer_model = usbtablet {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.092942] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] preallocate_images = none {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093128] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] publish_errors = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093286] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] pybasedir = /opt/stack/nova {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093467] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ram_allocation_ratio = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093631] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rate_limit_burst = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093796] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rate_limit_except_level = CRITICAL {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.093955] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rate_limit_interval = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.094156] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reboot_timeout = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.094325] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
reclaim_instance_interval = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.094484] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] record = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.094755] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reimage_timeout_per_gb = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.094967] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] report_interval = 120 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095176] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rescue_timeout = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095348] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reserved_host_cpus = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095510] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reserved_host_disk_mb = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095669] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reserved_host_memory_mb = 512 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095827] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] reserved_huge_pages = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.095987] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] resize_confirm_window = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096160] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] resize_fs_using_block_device = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096321] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] resume_guests_state_on_host_boot = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096486] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096646] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rpc_response_timeout = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096803] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] run_external_periodic_tasks = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.096970] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] running_deleted_instance_action = reap 
{{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.097143] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] running_deleted_instance_poll_interval = 1800 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.097303] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] running_deleted_instance_timeout = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.097465] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler_instance_sync_interval = 120 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.097600] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_down_time = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.097777] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] servicegroup_driver = db {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.098053] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] shelved_offload_time = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.098340] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] shelved_poll_interval = 3600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.098621] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] shutdown_timeout = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.098903] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] source_is_ipv6 = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.099198] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ssl_only = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.099547] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.099733] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] sync_power_state_interval = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.099904] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] sync_power_state_pool_size = 1000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100388] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] syslog_log_facility = LOG_USER {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100388] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] tempdir = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100453] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] timeout_nbd = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100573] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] transport_url = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100738] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] update_resources_interval = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.100899] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_cow_images = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101072] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_eventlog = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101238] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_journal = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101399] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_json = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101558] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_rootwrap_daemon = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101716] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_stderr = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.101875] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] use_syslog = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102045] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vcpu_pin_set = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102221] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plugging_is_fatal = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102390] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plugging_timeout = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102557] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] virt_mkfs = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102720] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] volume_usage_poll_interval = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.102880] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] watch_log_file = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.103058] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] web = /usr/share/spice-html5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 486.103281] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_concurrency.disable_process_locking = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.103603] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.103792] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.103963] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.104177] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.104358] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.104525] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.104710] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.auth_strategy = keystone {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.104882] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.compute_link_prefix = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105079] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105292] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.dhcp_domain = novalocal {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105469] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.enable_instance_password = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105635] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.glance_link_prefix = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105802] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.105976] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.106160] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.instance_list_per_project_cells = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.106327] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.list_records_by_skipping_down_cells = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.106492] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.local_metadata_per_cell = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.106660] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.max_limit = 1000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.106827] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.metadata_cache_expiration = 15 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107007] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.neutron_default_tenant_id = default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107185] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.use_forwarded_for = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107351] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.use_neutron_default_nets = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107520] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107683] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.107851] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108035] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108214] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_dynamic_targets = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108385] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_jsonfile_path = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108568] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108761] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.backend = dogpile.cache.memcached {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.108928] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.backend_argument = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109111] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.config_prefix = cache.oslo {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109286] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.dead_timeout = 60.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109462] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.debug_cache_backend = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109627] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.enable_retry_client = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109792] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.enable_socket_keepalive = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.109963] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.enabled = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110145] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.expiration_time = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110315] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.hashclient_retry_attempts = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110474] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.hashclient_retry_delay = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110636] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
cache.memcache_dead_retry = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110802] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_password = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.110967] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111143] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111314] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_pool_maxsize = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111474] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111635] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_sasl_enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111814] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.111981] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_socket_timeout = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.112171] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.memcache_username = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.112343] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.proxies = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.112511] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.retry_attempts = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.112678] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.retry_delay = 0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.112841] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.socket_keepalive_count = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113010] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.socket_keepalive_idle = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113180] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.socket_keepalive_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113337] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.tls_allowed_ciphers = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113493] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.tls_cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113648] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.tls_certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113807] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.tls_enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.113960] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cache.tls_keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.114171] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.114350] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.auth_type = password {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.114514] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.114686] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.catalog_info = volumev3::publicURL {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.114847] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115019] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115210] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.cross_az_attach = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115382] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.debug = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115542] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.endpoint_template = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115705] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 
None None] cinder.http_retries = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.115863] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116030] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116208] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.os_region_name = RegionOne {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116370] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116527] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cinder.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116698] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.116857] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.cpu_dedicated_set = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117026] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.cpu_shared_set = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117186] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.image_type_exclude_list = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117357] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117522] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.max_concurrent_disk_ops = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117681] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.max_disk_devices_to_attach = -1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.117838] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.118013] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
486.118183] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.resource_provider_association_refresh = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.118346] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.shutdown_retry_interval = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.118523] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.118699] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] conductor.workers = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.118871] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] console.allowed_origins = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119040] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] console.ssl_ciphers = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119217] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] console.ssl_minimum_version = default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119392] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] consoleauth.token_ttl = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119554] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119710] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.119873] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120043] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120210] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120368] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120530] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.insecure = False {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120687] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.120846] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121020] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121183] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121343] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121513] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.service_type = accelerator {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121675] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121835] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.121999] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.122175] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.122359] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.122521] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] cyborg.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.122704] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.backend = sqlalchemy {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.122882] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.connection = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123065] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.connection_debug = 0 {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123240] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.connection_parameters = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123405] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.connection_recycle_time = 3600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123574] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.connection_trace = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123738] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.db_inc_retry_interval = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.123903] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.db_max_retries = 20 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124102] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.db_max_retry_interval = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124283] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.db_retry_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124462] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.max_overflow = 50 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124627] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.max_pool_size = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124800] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.max_retries = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.124963] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.mysql_enable_ndb = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.125174] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.125355] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.mysql_wsrep_sync_wait = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.125522] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.pool_timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.125695] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.retry_interval = 10 
{{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.125857] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.slave_connection = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.126036] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.sqlite_synchronous = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.126208] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] database.use_db_reconnect = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.126393] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.backend = sqlalchemy {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.128091] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.connection = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.128297] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.connection_debug = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.128482] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.connection_parameters = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.128656] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.connection_recycle_time = 3600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.128832] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.connection_trace = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129012] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.db_inc_retry_interval = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129189] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.db_max_retries = 20 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129359] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.db_max_retry_interval = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129524] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.db_retry_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129696] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.max_overflow = 50 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.129861] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.max_pool_size = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130042] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.max_retries = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130218] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.mysql_enable_ndb = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130389] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130550] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130717] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.pool_timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.130889] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.retry_interval = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131063] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.slave_connection = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131239] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] api_database.sqlite_synchronous = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131418] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] devices.enabled_mdev_types = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131598] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131762] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ephemeral_storage_encryption.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.131929] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132114] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.api_servers = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132284] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.cafile = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132450] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132615] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132774] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.132934] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133113] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.debug = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133289] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.default_trusted_certificate_ids = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133456] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.enable_certificate_validation = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133621] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.enable_rbd_download = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133832] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.133949] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.134154] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.134327] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.134490] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.134659] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.num_retries = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.134834] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.rbd_ceph_conf = {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135016] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.rbd_connect_timeout = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135256] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.rbd_pool = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135460] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.rbd_user = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135632] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135796] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.135969] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.service_type = image {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136151] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136319] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136480] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136642] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136827] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.136994] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.verify_glance_signatures = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.137173] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] glance.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.137345] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] guestfs.debug = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.137520] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.config_drive_cdrom = False {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.137685] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.config_drive_inject_password = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.137853] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138030] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.enable_instance_metrics_collection = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138206] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.enable_remotefx = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138379] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.instances_path_share = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138547] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.iscsi_initiator_list = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138711] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.limit_cpu_features = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.138877] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139054] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139235] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.power_state_check_timeframe = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139399] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139572] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139737] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.use_multipath_io = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.139902] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.volume_attach_retry_count = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.140078] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.140250] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.vswitch_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.140416] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.140588] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] mks.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.140978] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.141165] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.manager_interval = 2400 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.141343] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.precache_concurrency = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.141518] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.remove_unused_base_images = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.141690] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.141861] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142052] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] image_cache.subdirectory_name = _base {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142240] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.api_max_retries = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142407] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.api_retry_interval = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142567] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142730] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.auth_type = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.142889] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143058] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143228] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143387] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143545] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143703] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.143865] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144043] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144217] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144380] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144541] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.partition_key = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144705] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.peer_list = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.144863] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145041] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.serial_console_state_timeout = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145229] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.service_name = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145406] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.service_type = baremetal {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145565] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145723] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.145879] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146049] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146240] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146400] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ironic.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146580] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146752] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] key_manager.fixed_key = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.146934] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147106] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.barbican_api_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147269] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.barbican_endpoint = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147436] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.barbican_endpoint_type = public {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147596] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.barbican_region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147940] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.147940] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148069] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148236] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148410] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148575] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.number_of_retries = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148734] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.retry_delay = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.148900] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.send_service_user_token = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149077] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149243] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149407] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.verify_ssl = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149567] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican.verify_ssl_path = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149735] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.149898] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.auth_type = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150069] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150233] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150395] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150554] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150709] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.150869] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151035] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] barbican_service_user.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151210] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.approle_role_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151369] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.approle_secret_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151525] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151680] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.151842] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152009] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152174] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152343] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.kv_mountpoint = secret {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152506] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.kv_version = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152667] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.namespace = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152823] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.root_token_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.152982] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153154] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.ssl_ca_crt_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153314] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153475] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.use_ssl = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153643] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153807] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.153967] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154170] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154338] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154497] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154656] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154818] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.154977] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155168] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 
None None] keystone.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155339] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155498] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155656] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155827] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.service_type = identity {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.155991] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.156166] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.156329] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.156489] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.156670] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.156833] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] keystone.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157046] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.connection_uri = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157219] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_mode = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157390] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_model_extra_flags = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157560] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_models = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157730] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.157896] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_power_governor_low = powersave {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158069] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_power_management = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158250] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158414] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.device_detach_attempts = 8 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158578] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.device_detach_timeout = 20 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158744] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.disk_cachemodes = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.158903] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.disk_prefix = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159077] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.enabled_perf_events = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159245] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.file_backed_memory = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159409] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.gid_maps = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159600] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.hw_disk_discard = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159719] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.hw_machine_type = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.159888] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_rbd_ceph_conf = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160065] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160240] env[59382]: DEBUG 
oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160410] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_rbd_glance_store_name = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160578] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_rbd_pool = rbd {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160747] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_type = default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.160905] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.images_volume_group = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161077] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.inject_key = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161245] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.inject_partition = -2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161405] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.inject_password = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161568] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.iscsi_iface = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161731] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.iser_use_multipath = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.161895] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_bandwidth = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162073] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162242] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_downtime = 500 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162406] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162567] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162726] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_inbound_addr = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.162886] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163056] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_permit_post_copy = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163234] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_scheme = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163412] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_timeout_action = abort {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163581] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_tunnelled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163744] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_uri = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.163909] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.live_migration_with_native_tls = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.164104] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.max_queues = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.164285] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.164447] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.nfs_mount_options = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.164761] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.164938] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165138] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_iser_scan_tries = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165321] env[59382]: DEBUG 
oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_memory_encrypted_guests = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165491] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165657] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_pcie_ports = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165826] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.num_volume_scan_tries = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.165994] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.pmem_namespaces = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.166172] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.quobyte_client_cfg = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.166465] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.166642] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rbd_connect_timeout = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.166811] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.166975] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167156] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rbd_secret_uuid = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167320] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rbd_user = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167485] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167658] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.remote_filesystem_transport = ssh {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167819] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rescue_image_id = None {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.167979] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rescue_kernel_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.168151] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rescue_ramdisk_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.168325] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.168486] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.rx_queue_size = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.168656] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.smbfs_mount_options = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.168933] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.169121] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.snapshot_compression = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.169287] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.snapshot_image_format = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.169510] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.169696] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.sparse_logical_volumes = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.169841] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.swtpm_enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170019] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.swtpm_group = tss {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170194] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.swtpm_user = tss {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170366] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.sysinfo_serial = unique {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170526] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.tx_queue_size = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170692] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.uid_maps = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.170854] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.use_virtio_for_bridges = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171035] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.virt_type = kvm {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171211] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.volume_clear = zero {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171373] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.volume_clear_size = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171541] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.volume_use_multipath = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171700] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_cache_path = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.171869] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.172048] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_mount_group = qemu {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.172222] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_mount_opts = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.172392] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.172664] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.172841] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.vzstorage_mount_user = stack {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173023] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173195] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173374] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.auth_type = password {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173538] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173698] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.173862] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174053] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174230] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174406] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.default_floating_pool = public {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174565] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174728] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.extension_sync_interval = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.174889] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.http_retries = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175079] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175263] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175425] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175595] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175754] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.175921] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.ovs_bridge = br-int {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176099] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.physnets = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176275] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.region_name = RegionOne {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176444] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.service_metadata_proxy = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176604] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176769] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.service_type = network {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.176929] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177101] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177262] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177421] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177602] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177764] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] neutron.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.177934] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] notifications.bdms_in_notifications = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178124] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] notifications.default_level = INFO 
{{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178303] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] notifications.notification_format = unversioned {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178467] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] notifications.notify_on_state_change = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178642] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178819] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] pci.alias = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.178988] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] pci.device_spec = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.179169] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] pci.report_in_placement = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.179344] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.179516] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.auth_type = password {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.179685] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.179845] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180007] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180179] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180338] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180494] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180649] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.default_domain_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180804] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.default_domain_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.180959] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.domain_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181127] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.domain_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181286] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181445] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181600] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181755] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.181908] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182087] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.password = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182253] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.project_domain_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182419] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.project_domain_name = Default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182585] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.project_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182757] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.project_name = service {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.182924] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.region_name = RegionOne {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183095] env[59382]: DEBUG oslo_service.service 
[None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183269] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.service_type = placement {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183433] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183591] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183750] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.183910] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.system_scope = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184101] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184277] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.trust_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184438] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.user_domain_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184606] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.user_domain_name = Default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184769] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.user_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.184943] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.username = placement {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.185224] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.185422] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] placement.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.185608] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.cores = 20 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.185777] env[59382]: DEBUG 
oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.count_usage_from_placement = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.185951] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186146] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.injected_file_content_bytes = 10240 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186321] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.injected_file_path_length = 255 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186488] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.injected_files = 5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186656] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.instances = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186822] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.key_pairs = 100 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.186988] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.metadata_items = 128 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.187190] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.ram = 51200 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.187434] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.recheck_quota = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.187621] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.server_group_members = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.187792] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] quota.server_groups = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.187964] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rdp.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.188290] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.188486] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.188658] 
env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.188824] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.image_metadata_prefilter = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.188989] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.189204] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.max_attempts = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.189397] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.max_placement_results = 1000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.189568] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.189734] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.query_placement_for_availability_zone = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.189900] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.query_placement_for_image_type_support = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.190077] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.190260] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] scheduler.workers = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.190482] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.190762] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.191087] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.191371] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.191626] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.191817] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.191994] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.192206] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.192384] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.host_subset_size = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.192551] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.192721] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.192889] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.isolated_hosts = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193068] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.isolated_images = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193241] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193405] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193568] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.pci_in_placement = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193736] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.193901] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194099] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194276] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194443] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194614] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194780] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.track_instance_changes = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.194960] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.195182] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metrics.required = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.195361] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metrics.weight_multiplier = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.195529] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.195697] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] metrics.weight_setting = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196007] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196194] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196376] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.port_range = 10000:20000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196551] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196722] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.196890] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] serial_console.serialproxy_port = 6083 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197082] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197323] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.auth_type = password {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197499] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197664] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197828] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.197992] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.198170] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.198345] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.send_service_user_token = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.198512] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] service_user.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.198671] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None 
None] service_user.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.198843] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.agent_enabled = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.199031] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.199338] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.199537] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.199714] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.html5proxy_port = 6082 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.199879] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.image_compression = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200053] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.jpeg_compression = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200221] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.playback_compression = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200394] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.server_listen = 127.0.0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200565] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200728] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.streaming_mode = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.200885] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] spice.zlib_compression = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201063] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] upgrade_levels.baseapi = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201232] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] upgrade_levels.cert = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201405] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] upgrade_levels.compute = auto {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201566] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] upgrade_levels.conductor = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201732] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] upgrade_levels.scheduler = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.201904] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202082] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.auth_type = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202247] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202407] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202568] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202728] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.202886] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203059] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203224] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vendordata_dynamic_auth.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203400] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.api_retry_count = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203564] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.ca_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203736] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.cache_prefix = devstack-image-cache {{(pid=59382) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.203904] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.cluster_name = testcl1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.204098] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.connection_pool_size = 10 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.204274] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.console_delay_seconds = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.204446] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.datastore_regex = ^datastore.* {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.204653] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.204829] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.host_password = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205018] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.host_port = 443 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205211] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.host_username = administrator@vsphere.local {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205387] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.insecure = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205552] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.integration_bridge = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205717] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.maximum_objects = 100 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.205876] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.pbm_default_policy = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206049] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.pbm_enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206218] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.pbm_wsdl_location = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206389] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206546] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.serial_port_proxy_uri = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206704] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.serial_port_service_uri = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.206871] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.task_poll_interval = 0.5 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207054] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.use_linked_clone = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207230] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.vnc_keymap = en-us {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207396] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.vnc_port = 5900 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207563] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vmware.vnc_port_total = 10000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207747] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.auth_schemes = ['none'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.207925] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.208228] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.208420] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.208596] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.novncproxy_port = 6080 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.208774] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.server_listen = 127.0.0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.208948] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.209150] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 
None None] vnc.vencrypt_ca_certs = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.209341] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.vencrypt_client_cert = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.209507] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vnc.vencrypt_client_key = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.209689] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.209856] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_deep_image_inspection = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210057] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210206] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210373] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210536] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.disable_rootwrap = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210699] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.enable_numa_live_migration = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.210860] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211032] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211203] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211364] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.libvirt_disable_apic = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211526] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211689] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.211852] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212024] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212195] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212358] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212522] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212681] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.212841] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213019] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213209] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213381] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.client_socket_timeout = 900 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213548] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.default_pool_size = 1000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213717] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.keep_alive = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.213884] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
wsgi.max_header_line = 16384 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214078] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.secure_proxy_ssl_header = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214256] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.ssl_ca_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214421] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.ssl_cert_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214581] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.ssl_key_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214747] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.tcp_keepidle = 600 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.214921] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.215119] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] zvm.ca_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.215293] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] zvm.cloud_connector_url = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.215579] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.215757] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] zvm.reachable_timeout = 300 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.215940] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.enforce_new_defaults = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.216129] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.enforce_scope = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.216311] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.policy_default_rule = default {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.216491] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 486.216664] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.policy_file = policy.yaml {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.216838] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.216998] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.217173] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.217336] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.217499] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.217667] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.217847] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218033] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.connection_string = messaging:// {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218212] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.enabled = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218382] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.es_doc_type = notification {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218547] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.es_scroll_size = 10000 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218715] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.es_scroll_time = 2m {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.218878] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.filter_error_trace = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219056] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.hmac_keys = SECRET_KEY {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219229] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.sentinel_service_name = mymaster {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219400] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.socket_timeout = 0.1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219561] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] profiler.trace_sqlalchemy = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219727] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] remote_debug.host = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.219885] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] remote_debug.port = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220074] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220247] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220412] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220573] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220734] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.220895] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221068] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221267] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221445] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221606] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221782] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.221954] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222144] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222317] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222482] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222658] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222822] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.222986] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.223167] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.223338] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.223502] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.223672] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.223835] env[59382]: DEBUG 
oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224011] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224215] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224389] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224567] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224742] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.224908] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.225103] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.225286] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_rabbit.ssl_version = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.225479] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.225648] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_notifications.retry = -1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.225831] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226015] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_messaging_notifications.transport_url = **** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226200] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.auth_section = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226365] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.auth_type = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226524] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.cafile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226682] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.certfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.226847] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.collect_timing = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227014] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.connect_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227184] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.connect_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227342] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.endpoint_id = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227501] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.endpoint_override = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227664] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.insecure = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227820] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.keyfile = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.227976] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.max_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228166] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.min_version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228341] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.region_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228502] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.service_name = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228661] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.service_type = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228824] env[59382]: DEBUG oslo_service.service [None 
req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.split_loggers = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.228983] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.status_code_retries = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229161] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.status_code_retry_delay = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229322] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.timeout = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229479] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.valid_interfaces = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229635] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_limit.version = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229802] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_reports.file_event_handler = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.229974] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230144] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] oslo_reports.log_dir = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230325] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230484] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230643] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230812] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.230977] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231149] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231321] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231484] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.group = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231644] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231810] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.231972] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.232170] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] vif_plug_ovs_privileged.user = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.232350] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.flat_interface = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.232530] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.232705] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.232877] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233059] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233234] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233403] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233567] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233745] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.isolate_vif = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.233914] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.234118] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.234308] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.234506] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.ovsdb_interface = native {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.234682] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_vif_ovs.per_port_bridge = False {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.234850] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_brick.lock_path = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235028] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235200] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235372] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.capabilities = [21] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235531] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.group = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235689] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.helper_command = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.235852] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236025] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236194] env[59382]: DEBUG oslo_service.service 
[None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] privsep_osbrick.user = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236365] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236521] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.group = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236675] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.helper_command = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236837] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.236999] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.237171] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] nova_sys_admin.user = None {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 486.237303] env[59382]: DEBUG oslo_service.service [None req-01e56d68-13bc-45c1-86bb-78b4f340c288 None None] ******************************************************************************** {{(pid=59382) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 486.237737] env[59382]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 486.246189] env[59382]: INFO nova.virt.node [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Generated node identity 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 [ 486.246441] env[59382]: INFO nova.virt.node [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Wrote node identity 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 to /opt/stack/data/n-cpu-1/compute_id [ 486.257852] env[59382]: WARNING nova.compute.manager [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Compute nodes ['0ed62ac0-b25e-450c-a6ea-1ad3f7977975'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 486.289087] env[59382]: INFO nova.compute.manager [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 486.310291] env[59382]: WARNING nova.compute.manager [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 486.310536] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 486.310756] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 486.310902] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 486.311082] env[59382]: DEBUG nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 486.312180] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9dedf53-f7ca-4c2b-b901-efd7c8a24a12 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.321112] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-279243b2-d923-4c59-86b8-80fb58462e73 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.335098] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7948eb20-92fd-47f5-80dd-f74ac525e290 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.341126] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef539267-484f-4ee5-82fb-4f456fd471a7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.369536] env[59382]: DEBUG nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181245MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 486.369680] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 486.369859] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 486.381161] env[59382]: WARNING nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] No compute node 
record for cpu-1:0ed62ac0-b25e-450c-a6ea-1ad3f7977975: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 could not be found. [ 486.393531] env[59382]: INFO nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 [ 486.442896] env[59382]: DEBUG nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 486.443114] env[59382]: DEBUG nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 486.571274] env[59382]: INFO nova.scheduler.client.report [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] [req-7736b50a-404c-4f97-ba46-c53515af4bbb] Created resource provider record via placement API for resource provider with UUID 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 486.591858] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a76afcb-c380-43e9-85df-d7f530b62020 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.599890] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21eef6f-a2e4-4372-bf03-a6ab4eda094b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.628408] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3af6a79a-3f52-4e27-9d47-4a102af24876 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.635163] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3614051a-a767-4f6e-81b1-ea9383c682bf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 486.647929] env[59382]: DEBUG nova.compute.provider_tree [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 486.686308] env[59382]: DEBUG nova.scheduler.client.report [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Updated inventory for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 486.686562] env[59382]: DEBUG nova.compute.provider_tree [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Updating resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 generation from 0 to 1 during operation: update_inventory {{(pid=59382) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 486.686706] env[59382]: DEBUG nova.compute.provider_tree [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 486.728326] env[59382]: DEBUG nova.compute.provider_tree [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Updating resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 generation from 1 to 2 during operation: update_traits {{(pid=59382) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 486.745621] env[59382]: DEBUG nova.compute.resource_tracker [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 486.745814] env[59382]: DEBUG oslo_concurrency.lockutils [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 486.745978] env[59382]: DEBUG nova.service [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Creating RPC server for service compute {{(pid=59382) start /opt/stack/nova/nova/service.py:182}} [ 486.760490] env[59382]: DEBUG nova.service [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] Join ServiceGroup membership for this service compute {{(pid=59382) start /opt/stack/nova/nova/service.py:199}} [ 486.760685] env[59382]: DEBUG nova.servicegroup.drivers.db [None req-4547915b-f005-4d4b-ad06-5ff399421c6c None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59382) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 531.587027] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "f1ce8104-72de-488b-8a41-379978af0f54" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 531.587027] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "f1ce8104-72de-488b-8a41-379978af0f54" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 531.609225] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 531.723777] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 531.724090] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 531.728511] env[59382]: INFO nova.compute.claims [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 531.857102] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931a5c15-806e-4477-86b1-0ba881061f8c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 531.865552] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c76f7395-bea3-48f7-9b69-0b19cc5be6ec {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 531.905295] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd8aadb-3c56-4c58-93fe-07b1a37d64dd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 531.915059] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d73b89-45f7-4927-9bd9-e9e3bd3da8fe {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 531.937569] env[59382]: DEBUG nova.compute.provider_tree [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 531.957578] env[59382]: DEBUG nova.scheduler.client.report [None 
req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 531.976543] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 531.977150] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 532.027999] env[59382]: DEBUG nova.compute.utils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 532.032039] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 532.032312] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 532.053568] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 532.148645] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 532.407964] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 532.409673] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 532.409673] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 532.409673] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 532.409673] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 532.409673] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 532.409995] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 532.410260] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 532.410935] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 532.411291] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 532.411512] env[59382]: DEBUG nova.virt.hardware [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 532.412817] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ebf6f5f-1705-4e95-a40f-3fcc1fad0faa {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.421607] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c2efc2f-c847-4396-9df4-6a3af8ced742 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.447403] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1009d221-0f11-4588-9620-7725b0b967f5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.703591] env[59382]: DEBUG nova.policy [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '639ad1a4c01a494d9bb6a961535f09cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9048236f0e6743cf87c0ffa5addd17db', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 533.426748] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "4c529a26-0160-441b-b46c-7e794b079249" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.426748] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 
tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "4c529a26-0160-441b-b46c-7e794b079249" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.440489] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 533.496585] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 533.496585] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 533.496585] env[59382]: INFO nova.compute.claims [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 533.615278] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a73cf7-ba05-4e2e-ad09-b962582f7664 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.626687] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c646b0e9-1083-4c50-9b9e-f9f4c0edb1ef {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.660558] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09ef35c9-78cb-424e-9bad-da367243acb7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.668402] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1290e36-3109-4f7f-b50b-31d9cf2180f6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.683642] env[59382]: DEBUG nova.compute.provider_tree [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 533.699174] env[59382]: DEBUG nova.scheduler.client.report [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 533.719721] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 533.719987] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 533.765099] env[59382]: DEBUG nova.compute.utils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 533.766494] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 533.766725] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 533.791697] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 533.877165] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 533.915305] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 533.915305] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 533.915305] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 533.915585] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 533.915585] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 533.915585] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 533.915585] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 533.915585] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 533.915741] env[59382]: DEBUG nova.virt.hardware [None 
req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 533.915741] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 533.915741] env[59382]: DEBUG nova.virt.hardware [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 533.918679] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87a028f8-2fce-4160-a1da-cdfb1714de43 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 533.928404] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0747ae7-d1f5-4280-abdf-c1f0e6648837 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 534.041868] env[59382]: DEBUG nova.policy [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8f561a9bda6a43ef98c77259ef8bb1ad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'add015e8c0cc413eb4a4d92da82522c4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 534.129768] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Successfully created port: ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 534.999053] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 534.999388] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 535.008551] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 535.069193] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 535.069193] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 535.069193] env[59382]: INFO nova.compute.claims [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 535.189024] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e9ef9a0-406a-4886-b864-9e4d41ad0c5e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.198171] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b0a7f5f-7424-4234-a310-e0843c9a5b7c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.233391] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7069aa4d-1681-48d0-956d-b46ecd4fcd06 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.242399] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aad7040-cef9-4e9e-b4cc-b1882f917eba {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.263034] env[59382]: DEBUG nova.compute.provider_tree [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 535.276976] env[59382]: DEBUG nova.scheduler.client.report [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 535.302122] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 535.302624] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 535.357140] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Successfully created port: 5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 535.380289] env[59382]: DEBUG nova.compute.utils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 535.380289] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 535.380434] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 535.399597] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 535.487710] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 535.510934] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 535.511163] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 535.511347] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 535.511517] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 535.511635] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 535.511809] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 535.511974] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 535.512229] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 535.512618] env[59382]: DEBUG nova.virt.hardware [None 
req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 535.512881] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 535.512984] env[59382]: DEBUG nova.virt.hardware [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 535.513838] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aedb3006-efda-4b0d-a5eb-441854aeb431 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.522920] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff4aa72d-c906-43e3-b896-12e331b12c0f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.585940] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 535.586430] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 535.604656] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 535.648712] env[59382]: DEBUG nova.policy [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5dbe6e65bcf74b82a90b740c9647b57c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7eebf063184a48369400f3da30cf45ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 535.674605] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 535.674738] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 535.676271] env[59382]: INFO nova.compute.claims [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 535.811882] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b22bb57-154d-4604-a30b-88682738cb23 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.821941] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93c7f475-ba1d-4efe-af8b-743b487d4bbd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.858335] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8291cf19-adfd-4d8d-a757-bb3aa561c4c8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.866778] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e797bb-4e20-4331-a616-a2fb2a12241d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 535.880593] env[59382]: DEBUG nova.compute.provider_tree [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 535.893446] env[59382]: DEBUG nova.scheduler.client.report [None 
req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 535.911842] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 535.913980] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 535.958057] env[59382]: DEBUG nova.compute.utils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 535.962325] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 535.963675] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 535.969238] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 536.046764] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 536.071163] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 536.071892] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 536.071892] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 536.071892] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 536.071892] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 536.072117] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 536.072215] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 536.072373] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 536.072536] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 536.072697] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 536.072934] env[59382]: DEBUG nova.virt.hardware [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 536.073733] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84022de1-c486-4443-a66e-98741bad9d36 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.082681] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfc61f5f-a6c9-4d72-9dc1-0efac26f78b9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.174832] env[59382]: DEBUG nova.policy [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db4c47ce16424f2caaebf3a410def05c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c3f0bf166d174ac6b89972f64eea1a92', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 536.656445] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Successfully updated port: ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 536.660690] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "dee2197a-8c39-4655-be3e-e20fb72f518a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.660806] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a" 
acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.673757] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 536.673908] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquired lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 536.674070] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 536.681454] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 536.746983] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.747721] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.748867] env[59382]: INFO nova.compute.claims [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 536.776793] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.777529] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d 
tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 536.788754] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 536.803399] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 536.887018] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 536.959112] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92657917-90d9-4cf5-9559-b40452a87054 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.971986] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-933fa503-f7b2-41b4-97b0-02f75faf7011 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 536.977994] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Successfully created port: b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 537.012398] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-271f4399-687e-4679-a387-2fa5bad38d15 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.022230] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-506a9db1-6c51-4ba4-a197-0b62120f709b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.028425] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Successfully created port: 88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 537.040657] env[59382]: DEBUG nova.compute.provider_tree [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 
tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 537.059361] env[59382]: DEBUG nova.scheduler.client.report [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 537.076081] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 537.076602] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 537.078984] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.192s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 537.080347] env[59382]: INFO nova.compute.claims [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 537.126088] env[59382]: DEBUG nova.compute.utils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 537.127359] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Allocating IP information in the background. 
{{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 537.127519] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 537.138457] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 537.234288] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 537.268982] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 537.269209] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 537.269362] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 537.269540] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 537.269688] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
537.269831] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 537.271755] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 537.271755] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 537.271755] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 537.271755] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 537.271755] env[59382]: DEBUG nova.virt.hardware [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 537.271959] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f2139d5-9551-49a0-b082-08cea044cdf8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.285323] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d432a2db-cd3b-483c-98c8-fa0530fd9cfa {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.290910] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5355b825-a2f8-4a12-bb89-3adbff6ee5e6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.312546] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19b2a043-0e26-4cd6-808a-6f051e7d1623 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.345819] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec7482bc-8364-4cd0-8c3b-072c3bcc7340 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.353762] env[59382]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e76d191f-348d-4761-9929-1875d4b2cdd3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.367292] env[59382]: DEBUG nova.compute.provider_tree [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 537.370845] env[59382]: DEBUG nova.policy [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9702a8c287ce4b3a9e48669e01398a12', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28c98dc82546468791584d1f12a9ae5a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 537.379564] env[59382]: DEBUG nova.scheduler.client.report [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 537.397571] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 537.398060] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 537.434881] env[59382]: DEBUG nova.compute.utils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 537.436148] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Allocating IP information in the background. 
{{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 537.436319] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 537.444828] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 537.523255] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 537.548311] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 537.548638] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 537.548709] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 537.548866] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 537.549072] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 537.549251] env[59382]: DEBUG 
nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 537.549467] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 537.549792] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 537.549792] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 537.549939] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 537.550123] env[59382]: DEBUG nova.virt.hardware [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 537.550980] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-487737c8-e6ca-47b0-82ff-11f501f59faa {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.559690] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d72f9d4e-3557-407f-a57a-8ccda0a7d438 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.632495] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Updating instance_info_cache with network_info: [{"id": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "address": "fa:16:3e:4b:a0:41", "network": {"id": "1eedc249-ed1d-404e-ae1d-de25a8da4181", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-913496833-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": 
{"injected": false, "tenant_id": "9048236f0e6743cf87c0ffa5addd17db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad06bca7-13", "ovs_interfaceid": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 537.648838] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Releasing lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 537.649200] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Instance network_info: |[{"id": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "address": "fa:16:3e:4b:a0:41", "network": {"id": "1eedc249-ed1d-404e-ae1d-de25a8da4181", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-913496833-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9048236f0e6743cf87c0ffa5addd17db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad06bca7-13", "ovs_interfaceid": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 537.649653] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:4b:a0:41', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'eaba65c3-6925-4c7f-83b6-17cd1a328e27', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ad06bca7-13a9-4af5-a4df-2aa66f46989c', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 537.665533] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 537.666274] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dea71eff-6a5b-4ada-828d-b4c8111ce0c4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.681153] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Created folder: OpenStack in parent group-v4. [ 537.681363] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating folder: Project (9048236f0e6743cf87c0ffa5addd17db). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 537.684280] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5d5dabf9-7a20-41e3-83b9-602efceed8d6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.695378] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Created folder: Project (9048236f0e6743cf87c0ffa5addd17db) in parent group-v459741. [ 537.695570] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating folder: Instances. Parent ref: group-v459742. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 537.695895] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7ec98561-5367-4af7-8968-0ce809af5c9a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.707062] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Created folder: Instances in parent group-v459742. [ 537.707318] env[59382]: DEBUG oslo.service.loopingcall [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 537.707499] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 537.707687] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e1f5fd8e-3a25-40b4-960d-91e016d60695 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.728247] env[59382]: DEBUG nova.policy [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56493d62d06948b1a8f2e71130e5f43a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'be1223544d1e41d58240d204229800e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 537.735045] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 537.735045] env[59382]: value = "task-2256668" [ 537.735045] env[59382]: _type = "Task" [ 537.735045] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 537.742922] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256668, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 537.762647] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 537.781939] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Getting list of instances from cluster (obj){ [ 537.781939] env[59382]: value = "domain-c8" [ 537.781939] env[59382]: _type = "ClusterComputeResource" [ 537.781939] env[59382]: } {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 537.783585] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9604011a-7b02-4aab-a11e-991adc86926c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.793619] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Got total of 0 instances {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 537.794159] env[59382]: WARNING nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] While synchronizing instance power states, found 6 instances in the database and 0 instances on the hypervisor. 
[ 537.794159] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid f1ce8104-72de-488b-8a41-379978af0f54 {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.794284] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid 4c529a26-0160-441b-b46c-7e794b079249 {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.794418] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.794578] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid 4d131062-1c01-4c64-bf26-d38bf9da59d6 {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.794775] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid dee2197a-8c39-4655-be3e-e20fb72f518a {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.794962] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Triggering sync for uuid ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c {{(pid=59382) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10268}} [ 537.795323] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "f1ce8104-72de-488b-8a41-379978af0f54" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.795623] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "4c529a26-0160-441b-b46c-7e794b079249" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.795823] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.796063] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.796296] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "dee2197a-8c39-4655-be3e-e20fb72f518a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.796515] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.796736] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 537.797126] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Getting list of instances from cluster (obj){ [ 537.797126] env[59382]: value = "domain-c8" [ 537.797126] env[59382]: _type = "ClusterComputeResource" [ 537.797126] env[59382]: } {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 537.800864] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e68a4f-01fb-4365-8ace-b8a9d3cb4b88 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.812327] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Got total of 0 instances {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 537.934483] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Successfully updated port: 5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 537.956342] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 537.956491] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquired lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 537.956643] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 538.058105] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 538.186975] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 538.187204] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 538.198322] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 538.256753] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256668, 'name': CreateVM_Task, 'duration_secs': 0.312051} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 538.257147] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 538.271470] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 538.271718] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 538.276321] env[59382]: INFO nova.compute.claims [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 538.353114] env[59382]: DEBUG oslo_vmware.service [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e3fbe0-1ff3-4afc-9ab3-4920a79a3aaf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.360660] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 538.360660] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 538.363650] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 538.363650] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3a388f85-7302-4f0f-b80a-8c32f5584072 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.370873] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Waiting for the task: (returnval){ [ 538.370873] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52417125-063d-e0cd-8917-be45266c4472" [ 538.370873] env[59382]: _type = "Task" [ 538.370873] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 538.385968] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52417125-063d-e0cd-8917-be45266c4472, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 538.467031] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aba9889-c6a2-46f9-95b3-ba25ec288cf1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.476713] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a029148a-9338-494f-b2ff-5a154a1be364 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.511416] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94e6e904-e58c-4e8c-87f7-e28feb7e4338 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.519336] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92146a46-4860-4939-891b-08ff36ab68d0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.534207] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 538.546582] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 538.572325] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 538.572856] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 538.615868] env[59382]: DEBUG nova.compute.utils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 538.618179] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Successfully created port: 5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 538.621417] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 538.621417] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 538.634029] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 538.711833] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 538.738697] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 538.739490] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 538.739490] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 538.739490] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 538.739490] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 538.740804] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 538.740804] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 538.740804] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 538.740804] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 
tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 538.740804] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 538.740968] env[59382]: DEBUG nova.virt.hardware [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 538.741558] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92f8f26d-5b70-4b8c-8282-369e9095deaf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.757614] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41f34bd3-dafb-4d12-8a86-ad50c5f1c09c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.838191] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Successfully updated port: 88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 538.846081] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 538.846245] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquired lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 538.846396] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 538.884839] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 538.885122] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 538.885356] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 538.885498] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 538.885911] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 538.886176] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dc77a7f5-a659-4b68-9413-2c95bfa766ae {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.904185] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 538.904185] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 538.904574] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ce5fd65-8fcd-4155-8b80-0f3661ff8d4b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.911437] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06ea161b-c929-4f54-a434-4f41b0ccbf34 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.916666] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Waiting for the task: (returnval){ [ 538.916666] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52a1d0c3-9650-f798-1ef8-f9ccc62139f6" [ 538.916666] env[59382]: _type = "Task" [ 538.916666] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 538.933714] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 538.933714] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating directory with path [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 538.933714] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-801e8c3f-4db0-45a0-a7c3-2109ef10a09e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.953905] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 538.963178] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Created directory with path [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 538.963178] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Fetch image to [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 538.963178] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 538.963178] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c08d2889-5b69-47d7-b3e9-75168a2239c1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.970590] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4c1e7fe-1cd0-4af4-9991-1af36017f162 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.982380] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33cc30b7-7807-4753-a507-56b4445a9183 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.018626] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dce169d7-c706-4521-89ec-0b31db96bde8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.024823] env[59382]: DEBUG nova.policy [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8949b5c249e47bbba781a4aa0d0f065', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e07673924d647a9aa97917daba6b838', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 539.028365] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with 
opID=oslo.vmware-c80aad3a-7750-4c18-8059-b274afe29622 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.067016] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 539.132151] env[59382]: DEBUG oslo_vmware.rw_handles [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 539.206936] env[59382]: DEBUG oslo_vmware.rw_handles [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 539.206936] env[59382]: DEBUG oslo_vmware.rw_handles [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 539.235681] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Updating instance_info_cache with network_info: [{"id": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "address": "fa:16:3e:67:03:86", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ebb8bf4-ce", "ovs_interfaceid": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 539.255169] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Releasing lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 539.255308] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Instance network_info: |[{"id": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "address": "fa:16:3e:67:03:86", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ebb8bf4-ce", "ovs_interfaceid": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 539.255673] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None 
req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:03:86', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5ebb8bf4-ced4-4546-88a7-a13820f0ffb7', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 539.264934] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Creating folder: Project (add015e8c0cc413eb4a4d92da82522c4). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 539.264934] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0481c60a-d8eb-4148-aa9f-f832037b167b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.275968] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Created folder: Project (add015e8c0cc413eb4a4d92da82522c4) in parent group-v459741. [ 539.276177] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Creating folder: Instances. Parent ref: group-v459745. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 539.276409] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6ba97f67-5ef0-44e9-98d1-ab1cda0401d8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.286425] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Created folder: Instances in parent group-v459745. [ 539.286793] env[59382]: DEBUG oslo.service.loopingcall [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 539.286896] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 539.287046] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bb69a259-a4b9-4ae6-90c1-8cc65dad97d4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.312558] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 539.312558] env[59382]: value = "task-2256671" [ 539.312558] env[59382]: _type = "Task" [ 539.312558] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 539.323726] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256671, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 539.380023] env[59382]: DEBUG nova.compute.manager [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Received event network-vif-plugged-ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 539.380266] env[59382]: DEBUG oslo_concurrency.lockutils [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] Acquiring lock "f1ce8104-72de-488b-8a41-379978af0f54-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 539.380469] env[59382]: DEBUG oslo_concurrency.lockutils [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] Lock "f1ce8104-72de-488b-8a41-379978af0f54-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 539.380648] env[59382]: DEBUG oslo_concurrency.lockutils [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] Lock "f1ce8104-72de-488b-8a41-379978af0f54-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 539.380845] env[59382]: DEBUG nova.compute.manager [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] No waiting events found dispatching network-vif-plugged-ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 539.381071] env[59382]: WARNING nova.compute.manager [req-dfe48057-00ee-4d5a-b326-dfb47802d4be req-8fe6f90b-dc27-429b-9b1b-e2d326377d16 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Received unexpected event network-vif-plugged-ad06bca7-13a9-4af5-a4df-2aa66f46989c for instance with vm_state building and task_state spawning. 
[ 539.473028] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Successfully created port: e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 539.550316] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Updating instance_info_cache with network_info: [{"id": "88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "address": "fa:16:3e:c6:69:91", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88406a1c-f3", "ovs_interfaceid": "88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 539.566156] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Releasing lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 539.566471] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Instance network_info: |[{"id": "88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "address": "fa:16:3e:c6:69:91", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88406a1c-f3", "ovs_interfaceid": 
"88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 539.566844] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c6:69:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '88406a1c-f3e4-4564-8a4b-0f4b6fb96d59', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 539.580130] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Creating folder: Project (c3f0bf166d174ac6b89972f64eea1a92). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 539.580461] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fa893c5e-e364-4936-8798-2c4805e755b5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.594345] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Created folder: Project (c3f0bf166d174ac6b89972f64eea1a92) in parent group-v459741. [ 539.594345] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Creating folder: Instances. Parent ref: group-v459748. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 539.594345] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9b70c23a-95fa-4d2a-8c63-6dfdc088182c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.605389] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Created folder: Instances in parent group-v459748. [ 539.605617] env[59382]: DEBUG oslo.service.loopingcall [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 539.605849] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 539.606078] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-99de8f2f-094a-4403-bdc3-493af3bc6306 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.626784] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 539.626784] env[59382]: value = "task-2256674" [ 539.626784] env[59382]: _type = "Task" [ 539.626784] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 539.635377] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256674, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 539.689016] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Successfully updated port: b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 539.702884] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 539.703039] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquired lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 539.703193] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 539.824033] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256671, 'name': CreateVM_Task, 'duration_secs': 0.327921} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 539.824245] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 539.825260] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 539.825260] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 539.825442] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 539.825724] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e43e4ed-4962-4f4a-a4d3-e3c8de3aa96e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 539.831021] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Waiting for the task: (returnval){ [ 539.831021] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5216928b-2f4b-eb8d-3a02-b6d0a1728454" [ 539.831021] env[59382]: _type = "Task" [ 539.831021] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 539.842302] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5216928b-2f4b-eb8d-3a02-b6d0a1728454, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 539.876578] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 540.019684] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Successfully updated port: 5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 540.032392] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.032392] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 540.032392] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 540.072952] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 540.141451] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256674, 'name': CreateVM_Task, 'duration_secs': 0.298159} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 540.141632] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 540.142498] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.262668] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Updating instance_info_cache with network_info: [{"id": "5fe70167-c7b3-4491-83ca-9408151b47bf", "address": "fa:16:3e:04:d3:5c", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.106", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe70167-c7", "ovs_interfaceid": "5fe70167-c7b3-4491-83ca-9408151b47bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.287453] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 540.287914] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Instance network_info: |[{"id": "5fe70167-c7b3-4491-83ca-9408151b47bf", "address": "fa:16:3e:04:d3:5c", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.106", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": 
false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe70167-c7", "ovs_interfaceid": "5fe70167-c7b3-4491-83ca-9408151b47bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 540.292671] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:04:d3:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5fe70167-c7b3-4491-83ca-9408151b47bf', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 540.302597] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating folder: Project (28c98dc82546468791584d1f12a9ae5a). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.303613] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-81ed5dde-015b-4c61-8ce3-e2fd5ef8aac9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.317899] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created folder: Project (28c98dc82546468791584d1f12a9ae5a) in parent group-v459741. [ 540.318272] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating folder: Instances. Parent ref: group-v459751. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.319057] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d71bc30c-a3ce-415d-a6b6-6e764533eaa3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.330132] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created folder: Instances in parent group-v459751. [ 540.330505] env[59382]: DEBUG oslo.service.loopingcall [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 540.330777] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 540.330978] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1c2ce78a-74fb-4ffb-ba72-af35ff188609 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.361460] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 540.361702] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 540.361944] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.362201] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 540.362201] env[59382]: value = "task-2256677" [ 540.362201] env[59382]: _type = "Task" [ 540.362201] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 540.362370] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 540.363022] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 540.363022] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d24b2ab1-897a-4e2c-978d-4334f6742f99 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.374087] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256677, 'name': CreateVM_Task} progress is 5%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 540.375336] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Waiting for the task: (returnval){ [ 540.375336] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]523159aa-80d1-f1de-ced5-b52eaa2821c4" [ 540.375336] env[59382]: _type = "Task" [ 540.375336] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 540.385413] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]523159aa-80d1-f1de-ced5-b52eaa2821c4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 540.747722] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Successfully created port: 56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 540.854237] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Updating instance_info_cache with network_info: [{"id": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "address": "fa:16:3e:49:35:82", "network": {"id": "9f8268fb-191e-4608-a4f7-7b7169b302b9", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1778494407-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7eebf063184a48369400f3da30cf45ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6edb8eae-1113-49d0-84f7-9fd9f82b26fb", "external-id": "nsx-vlan-transportzone-493", "segmentation_id": 493, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ae7c3a-da", "ovs_interfaceid": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 540.872848] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256677, 'name': CreateVM_Task, 'duration_secs': 0.340361} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 540.873642] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 540.874062] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Releasing lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 540.874329] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance network_info: |[{"id": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "address": "fa:16:3e:49:35:82", "network": {"id": "9f8268fb-191e-4608-a4f7-7b7169b302b9", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1778494407-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7eebf063184a48369400f3da30cf45ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6edb8eae-1113-49d0-84f7-9fd9f82b26fb", "external-id": "nsx-vlan-transportzone-493", "segmentation_id": 493, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ae7c3a-da", "ovs_interfaceid": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 540.875056] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.875982] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:49:35:82', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6edb8eae-1113-49d0-84f7-9fd9f82b26fb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b7ae7c3a-da81-4cc2-829b-e1305df47423', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 540.883507] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Creating folder: Project (7eebf063184a48369400f3da30cf45ac). 
Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.884071] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a2be97af-de68-425f-a31d-6fce7b8f7feb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.894663] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 540.894890] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 540.895192] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 540.895300] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 540.895585] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 540.895831] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c583a74-56c0-4d91-94d4-715bc106bfdb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.898708] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Created folder: Project (7eebf063184a48369400f3da30cf45ac) in parent group-v459741. [ 540.898886] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Creating folder: Instances. Parent ref: group-v459754. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 540.899414] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d36a5735-7cce-4f2c-b9f0-8f6a07a77374 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.902501] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 540.902501] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52bf0fbd-1601-6d60-406e-40c1ee689f7d" [ 540.902501] env[59382]: _type = "Task" [ 540.902501] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 540.910751] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52bf0fbd-1601-6d60-406e-40c1ee689f7d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 540.912567] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Created folder: Instances in parent group-v459754. [ 540.912820] env[59382]: DEBUG oslo.service.loopingcall [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 540.913011] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 540.913209] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-702ee306-892b-4f7e-985b-9389d7b670ac {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 540.934279] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 540.934279] env[59382]: value = "task-2256680" [ 540.934279] env[59382]: _type = "Task" [ 540.934279] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 540.939269] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256680, 'name': CreateVM_Task} progress is 0%. 
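
The Folder.CreateFolder invocations above build the two-level folder hierarchy new VMs land in: a "Project (<tenant id>)" folder under the OpenStack root folder, then an "Instances" folder beneath it. A sketch of that nesting, with create_folder standing in (hypothetically) for the vSphere CreateFolder call plus the DuplicateName handling the real code performs:

    def ensure_instance_folder(create_folder, root_ref, project_id):
        """Create (or reuse) the Project/Instances folder chain for new VMs."""
        # root_ref is the OpenStack root folder, e.g. 'group-v459741' above.
        project_ref = create_folder(root_ref, f"Project ({project_id})")
        # project_ref e.g. 'group-v459754'; the VM itself is created under here.
        return create_folder(project_ref, "Instances")
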
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 541.414889] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 541.415172] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 541.415522] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 541.447139] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256680, 'name': CreateVM_Task, 'duration_secs': 0.356604} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 541.447139] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 541.448965] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 541.449110] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 541.449421] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 541.449676] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be6b6f00-e72b-4282-9e89-36ba5425b7b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.456658] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for the task: (returnval){ [ 541.456658] env[59382]: value = 
"session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52ac1588-f56c-da6e-8bd2-3fba600ea27a" [ 541.456658] env[59382]: _type = "Task" [ 541.456658] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 541.467021] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 541.467021] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 541.472255] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 541.472758] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 541.473074] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 541.478950] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 541.537058] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 541.537339] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 541.538817] env[59382]: INFO nova.compute.claims [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 541.821907] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a14905ac-262b-47d8-86c6-23a0892700f3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.829896] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65efa5f7-e0bb-4dad-b329-260ea9cecf1c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.864498] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42677201-807d-4207-9c0f-eea8b96b117e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.873189] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09c4f11f-6356-4d49-b831-e13e1680d9a1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 541.887748] env[59382]: DEBUG nova.compute.provider_tree [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 541.903593] env[59382]: DEBUG nova.scheduler.client.report [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 541.921879] env[59382]: DEBUG oslo_concurrency.lockutils 
[None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.382s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 541.921879] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 541.932864] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Successfully updated port: e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 541.943208] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 541.943360] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquired lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 541.943526] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 541.969884] env[59382]: DEBUG nova.compute.utils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 541.969884] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Allocating IP information in the background. 
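
The Acquiring / acquired :: waited / "released" :: held triples throughout this log come from oslo.concurrency's lock wrapper, which times both the wait for the lock and the hold. That bookkeeping can be sketched with a plain threading lock (the real lockutils also supports named external file locks and semaphores):

    import threading
    import time
    from contextlib import contextmanager

    _locks = {}                      # lock name -> threading.Lock
    _registry_lock = threading.Lock()

    @contextmanager
    def timed_lock(name, owner):
        with _registry_lock:         # create each named lock on first use
            lock = _locks.setdefault(name, threading.Lock())
        t0 = time.monotonic()
        with lock:
            print(f'Lock "{name}" acquired by "{owner}" :: '
                  f'waited {time.monotonic() - t0:.3f}s')
            t1 = time.monotonic()
            try:
                yield
            finally:
                print(f'Lock "{name}" "released" by "{owner}" :: '
                      f'held {time.monotonic() - t1:.3f}s')

    # Usage, mirroring the instance_claim lines above:
    #   with timed_lock("compute_resources", "ResourceTracker.instance_claim"):
    #       ...claim resources...
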
{{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 541.969884] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 541.977864] env[59382]: DEBUG nova.compute.manager [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Received event network-vif-plugged-88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 541.978080] env[59382]: DEBUG oslo_concurrency.lockutils [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] Acquiring lock "4d131062-1c01-4c64-bf26-d38bf9da59d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 541.978346] env[59382]: DEBUG oslo_concurrency.lockutils [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 541.978425] env[59382]: DEBUG oslo_concurrency.lockutils [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 541.980156] env[59382]: DEBUG nova.compute.manager [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] No waiting events found dispatching network-vif-plugged-88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 541.980156] env[59382]: WARNING nova.compute.manager [req-77c44802-b0d6-452c-9cbd-4125b35882df req-687e7a85-f2be-47bf-9be7-363bd666d693 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Received unexpected event network-vif-plugged-88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 for instance with vm_state building and task_state spawning. [ 541.996661] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 542.098707] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Start spawning the instance on the hypervisor.
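
The network-vif-plugged sequence above is Nova's external-event rendezvous: a spawning thread may register a waiter for an event, and when Neutron delivers it the compute manager pops the matching waiter. Here nobody was waiting yet ("No waiting events found"), so the event is logged as unexpected and dropped. The pop semantics can be sketched with threading primitives (names illustrative, not Nova's internals):

    import threading
    from collections import defaultdict

    class InstanceEvents:
        """Rendezvous between event producers (Neutron) and waiting spawns."""

        def __init__(self):
            self._events = defaultdict(dict)   # instance uuid -> {event name: Event}
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, event_name):
            ev = threading.Event()
            with self._lock:
                self._events[instance_uuid][event_name] = ev
            return ev                          # caller blocks on ev.wait(timeout)

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:
                ev = self._events.get(instance_uuid, {}).pop(event_name, None)
            if ev is None:
                return False                   # "No waiting events found dispatching ..."
            ev.set()                           # wake the waiter
            return True
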
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 542.103379] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 542.139943] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 542.140708] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 542.142027] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 542.142027] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 542.142027] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 542.142027] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 542.142027] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 542.142418] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 542.143856] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 542.146019] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 542.146019] env[59382]: DEBUG nova.virt.hardware [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 542.146019] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b569b3c8-f02f-4974-bdf4-3a1c91acedcd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.155454] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02d64b4d-553a-4643-bd5e-8fee10668f8e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.200769] env[59382]: DEBUG nova.policy [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '49a9e2f73ea9418687f3c5367d8409df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a44ac353f1a7469b88b361b52174882d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 542.542357] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.542926] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.543015] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info 
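
The topology walk just above (limits 65536:65536:65536, one vCPU, result 1:1:1) enumerates every sockets x cores x threads factorization of the vCPU count that fits the limits and then sorts by preference. The enumeration step can be sketched as follows, treating a topology simply as a (sockets, cores, threads) tuple:

    def possible_cpu_topologies(vcpus, max_sockets, max_cores, max_threads):
        """Yield every (sockets, cores, threads) with s * c * t == vcpus."""
        for s in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % s:
                continue
            for c in range(1, min(vcpus // s, max_cores) + 1):
                if (vcpus // s) % c:
                    continue
                t = vcpus // (s * c)
                if t <= max_threads:
                    yield (s, c, t)

    # One vCPU under 65536-wide limits factors exactly one way,
    # matching "Got 1 possible topologies" above.
    assert list(possible_cpu_topologies(1, 65536, 65536, 65536)) == [(1, 1, 1)]
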
cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 542.543085] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 542.570746] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.570890] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.571040] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.571174] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.571302] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.571423] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.571545] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.572127] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 542.572186] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 542.572761] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.573033] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.573235] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.573426] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.573616] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.573803] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.574202] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
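
Note the guard in the last entry: a periodic task whose configured interval is non-positive is skipped outright rather than scheduled. A minimal periodic runner honoring that convention (a stand-in for oslo.service's periodic_task machinery, not its actual API):

    import time

    def run_periodic_tasks(tasks, tick=1.0):
        """tasks: iterable of (name, interval_seconds, callable); loops forever."""
        last_run = {}
        while True:
            now = time.monotonic()
            for name, interval, fn in tasks:
                if interval <= 0:
                    # Mirrors: "CONF.reclaim_instance_interval <= 0, skipping..."
                    continue
                if now - last_run.get(name, 0.0) >= interval:
                    print(f"Running periodic task {name}")
                    fn()
                    last_run[name] = now
            time.sleep(tick)
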
{{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 542.574368] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 542.585160] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 542.585160] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 542.585160] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 542.586049] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 542.586406] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a7f235c-79e9-4a15-94c9-4b5d20b896f2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.597984] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2751a246-53c9-4875-9801-7af70efc37c1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.615422] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11e17174-b29d-4881-80eb-75b9d43123a2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.623197] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337acee8-4771-47a4-a612-9b0ae12abb60 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.655954] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181252MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 542.656134] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 542.656336] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 542.756533] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance f1ce8104-72de-488b-8a41-379978af0f54 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.756819] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4c529a26-0160-441b-b46c-7e794b079249 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.757014] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.757248] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance dee2197a-8c39-4655-be3e-e20fb72f518a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.757460] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4d131062-1c01-4c64-bf26-d38bf9da59d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.757639] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.757952] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.758143] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 542.758448] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 542.758678] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 542.893286] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-329b1ab3-e175-4571-a051-554955bdb5d4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.901324] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c50bf816-c3af-46c5-ae1c-b904fb42a1a1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.932997] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Updating instance_info_cache with network_info: [{"id": "e086370f-bda5-46e8-90c0-be1605e95fe4", "address": "fa:16:3e:d3:7b:9a", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.247", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape086370f-bd", "ovs_interfaceid": "e086370f-bda5-46e8-90c0-be1605e95fe4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 542.934897] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-043aa802-0ab5-4137-ae0c-c81c2e05875d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.943523] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aebcdbf-e0f6-4e08-bdd4-7468ee177762 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.959701] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 
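
The numbers in this resource audit are internally consistent: the eight m1.nano instances (1 vCPU, 128 MB RAM, 1 GB root disk each) account for used_vcpus=8 and used_disk=8GB, and used_ram=1536MB is the 512 MB reserved for the host plus 8 x 128 MB. What placement can schedule from the reported inventory is (total - reserved) * allocation_ratio per resource class. A worked check of that arithmetic:

    # Inventory as reported to placement in the log above.
    inventory = {
        "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    def schedulable(rc):
        inv = inventory[rc]
        return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

    assert schedulable("VCPU") == 192.0          # 48 vCPUs oversubscribed 4x
    assert schedulable("MEMORY_MB") == 196078.0  # no RAM oversubscription
    assert schedulable("DISK_GB") == 400.0

    # Usage from the eight m1.nano instances listed above:
    assert 8 * 128 + 512 == 1536                 # used_ram (MB), incl. 512 reserved
    assert 8 * 1 == 8                            # used_disk (GB), 1 GB root each
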
0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 542.961378] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Releasing lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 542.961668] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance network_info: |[{"id": "e086370f-bda5-46e8-90c0-be1605e95fe4", "address": "fa:16:3e:d3:7b:9a", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.247", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape086370f-bd", "ovs_interfaceid": "e086370f-bda5-46e8-90c0-be1605e95fe4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 542.962211] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d3:7b:9a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e086370f-bda5-46e8-90c0-be1605e95fe4', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 542.969659] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Creating folder: Project (be1223544d1e41d58240d204229800e3). Parent ref: group-v459741. 
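
Directly above, the driver reduces the Neutron network_info entry to the small "Instance VIF info" dict it builds the VM from: the NSX logical-switch id from details becomes an opaque network_ref, and the port UUID and MAC address are carried over. A sketch of that projection, assuming the network_info shape shown in the log (the vif_model actually comes from image metadata; vmxnet3 is simply the value seen here):

    def vif_info_from_network_info(vif):
        """Project one Neutron network_info entry onto the VMware VIF info dict."""
        return {
            "network_name": vif["network"]["bridge"],               # 'br-int'
            "mac_address": vif["address"],                          # 'fa:16:3e:...'
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": vif["details"]["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],                                  # Neutron port UUID
            "vif_model": "vmxnet3",                                 # from image metadata
        }
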
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 542.970193] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b572eb84-0aa2-496e-beb7-4530bb0cd3fd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.973679] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 542.984629] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Created folder: Project (be1223544d1e41d58240d204229800e3) in parent group-v459741. [ 542.984957] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Creating folder: Instances. Parent ref: group-v459757. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 542.985081] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f0fdfd0-d8cf-4d9e-a3ed-542473625597 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 542.994339] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Created folder: Instances in parent group-v459757. [ 542.994608] env[59382]: DEBUG oslo.service.loopingcall [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 542.995310] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 542.995934] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 542.996347] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 542.996567] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dd422d06-9531-4e73-9a1b-ad85b128335a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.016736] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 543.016736] env[59382]: value = "task-2256683" [ 543.016736] env[59382]: _type = "Task" [ 543.016736] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 543.022273] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Successfully updated port: 56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 543.029101] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256683, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 543.030295] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 543.030428] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 543.030580] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 543.144847] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 543.291960] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "d31427c1-9979-4617-b5a1-43aee722d88d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 543.292094] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 543.302867] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 543.390856] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 543.391021] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 543.392890] env[59382]: INFO nova.compute.claims [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 543.424273] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Successfully created port: fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 543.534760] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256683, 'name': CreateVM_Task, 'duration_secs': 0.303362} completed successfully.
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 543.539787] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 543.545910] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 543.545910] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 543.546253] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 543.547433] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fcb0a958-7839-4144-bb56-46b56b6fd364 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.555212] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for the task: (returnval){ [ 543.555212] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e2b994-ccd7-38bb-cfcc-bc17599b2571" [ 543.555212] env[59382]: _type = "Task" [ 543.555212] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 543.563381] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e2b994-ccd7-38bb-cfcc-bc17599b2571, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 543.640326] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b13dd05-3938-477b-b03f-eb271c86393e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.646950] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-316c2d70-3e4a-407b-b07c-6169055f491b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.682642] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6b4751f-4d75-4125-be0f-e23cd27ae1fd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.691276] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ade5568b-2082-4a3c-9472-43092a3459c4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.705623] env[59382]: DEBUG nova.compute.provider_tree [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 543.715021] env[59382]: DEBUG nova.scheduler.client.report [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 543.730284] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 543.730775] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 543.753490] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Updating instance_info_cache with network_info: [{"id": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "address": "fa:16:3e:6a:cb:ff", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap56ced3eb-6d", "ovs_interfaceid": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 543.767911] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Releasing lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 543.768243] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance network_info: |[{"id": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "address": "fa:16:3e:6a:cb:ff", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap56ced3eb-6d", "ovs_interfaceid": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 543.768735] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None 
req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6a:cb:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '56ced3eb-6deb-43f2-b0c2-d3e238e3206c', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 543.776124] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Creating folder: Project (9e07673924d647a9aa97917daba6b838). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.777567] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-348bd1c3-d50a-4f1a-8acc-091927f36271 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.780395] env[59382]: DEBUG nova.compute.utils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 543.781832] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 543.781985] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 543.790151] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 543.798576] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Created folder: Project (9e07673924d647a9aa97917daba6b838) in parent group-v459741. [ 543.798856] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Creating folder: Instances. Parent ref: group-v459760. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 543.799251] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-896b55d4-a4a9-4504-818b-f68db586048f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.809139] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Created folder: Instances in parent group-v459760. [ 543.809381] env[59382]: DEBUG oslo.service.loopingcall [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 543.811413] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 543.811653] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38fbe0aa-a195-451e-bfc1-c0d9d0a66621 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.836773] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 543.836773] env[59382]: value = "task-2256686" [ 543.836773] env[59382]: _type = "Task" [ 543.836773] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 543.850107] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256686, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 543.867481] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 543.891517] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 543.891791] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 543.891961] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 543.892160] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 543.892319] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 543.892469] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 543.892691] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 543.892851] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 543.893137] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea 
tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 543.893205] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 543.893382] env[59382]: DEBUG nova.virt.hardware [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 543.894298] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6fda623-01cc-41f5-a846-d77037a041b1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.902498] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d325621-9172-4eb4-b397-d6f73b08aaa9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 544.067950] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 544.068292] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 544.068880] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 544.224205] env[59382]: DEBUG nova.policy [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b3c6f0232194e6b9950e8f8bd2d543e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96497a8820b54190935f68de46154ed0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 544.350691] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256686, 'name': CreateVM_Task, 'duration_secs': 0.297624} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 544.350691] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 544.351055] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 544.351255] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 544.353421] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 544.353421] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca2005c5-9004-4aa9-8853-234c55a7a3c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 544.357185] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for the task: (returnval){ [ 544.357185] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52d9aa9d-2847-ec72-1887-c5d51e73b006" [ 544.357185] env[59382]: _type = "Task" [ 544.357185] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 544.366157] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52d9aa9d-2847-ec72-1887-c5d51e73b006, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 544.370467] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Received event network-changed-ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 544.370467] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Refreshing instance network info cache due to event network-changed-ad06bca7-13a9-4af5-a4df-2aa66f46989c. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 544.370467] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquiring lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 544.370467] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquired lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 544.370467] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Refreshing network info cache for port ad06bca7-13a9-4af5-a4df-2aa66f46989c {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 544.867287] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 544.867681] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 544.867798] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 545.813145] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Updated VIF entry in instance network info cache for port ad06bca7-13a9-4af5-a4df-2aa66f46989c. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 545.813145] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Updating instance_info_cache with network_info: [{"id": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "address": "fa:16:3e:4b:a0:41", "network": {"id": "1eedc249-ed1d-404e-ae1d-de25a8da4181", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-913496833-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9048236f0e6743cf87c0ffa5addd17db", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "eaba65c3-6925-4c7f-83b6-17cd1a328e27", "external-id": "nsx-vlan-transportzone-202", "segmentation_id": 202, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad06bca7-13", "ovs_interfaceid": "ad06bca7-13a9-4af5-a4df-2aa66f46989c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 545.821998] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Releasing lock "refresh_cache-f1ce8104-72de-488b-8a41-379978af0f54" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 545.821998] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Received event network-vif-plugged-5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 545.821998] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquiring lock "4c529a26-0160-441b-b46c-7e794b079249-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 545.821998] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Lock "4c529a26-0160-441b-b46c-7e794b079249-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 545.822278] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Lock "4c529a26-0160-441b-b46c-7e794b079249-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 545.822278] env[59382]: DEBUG nova.compute.manager 
[req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] No waiting events found dispatching network-vif-plugged-5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 545.822278] env[59382]: WARNING nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Received unexpected event network-vif-plugged-5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 for instance with vm_state building and task_state spawning. [ 545.822405] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Received event network-changed-5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 545.822576] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Refreshing instance network info cache due to event network-changed-5ebb8bf4-ced4-4546-88a7-a13820f0ffb7. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 545.822761] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquiring lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 545.823005] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquired lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 545.823075] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Refreshing network info cache for port 5ebb8bf4-ced4-4546-88a7-a13820f0ffb7 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 546.123332] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Successfully created port: b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 546.535850] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Successfully updated port: fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 546.548500] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 546.548651] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquired lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 546.548801] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 546.672186] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 547.060571] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Received event network-changed-88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 547.060772] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Refreshing instance network info cache due to event network-changed-88406a1c-f3e4-4564-8a4b-0f4b6fb96d59. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 547.060976] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Acquiring lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 547.061395] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Acquired lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 547.061584] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Refreshing network info cache for port 88406a1c-f3e4-4564-8a4b-0f4b6fb96d59 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 547.209044] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Updated VIF entry in instance network info cache for port 5ebb8bf4-ced4-4546-88a7-a13820f0ffb7. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 547.209421] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Updating instance_info_cache with network_info: [{"id": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "address": "fa:16:3e:67:03:86", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5ebb8bf4-ce", "ovs_interfaceid": "5ebb8bf4-ced4-4546-88a7-a13820f0ffb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 547.219271] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Releasing lock "refresh_cache-4c529a26-0160-441b-b46c-7e794b079249" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 547.219521] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Received event network-vif-plugged-b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 547.219712] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquiring lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.219909] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.220079] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 547.220245] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 
req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] No waiting events found dispatching network-vif-plugged-b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 547.220409] env[59382]: WARNING nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Received unexpected event network-vif-plugged-b7ae7c3a-da81-4cc2-829b-e1305df47423 for instance with vm_state building and task_state spawning. [ 547.220573] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Received event network-changed-b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 547.220725] env[59382]: DEBUG nova.compute.manager [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Refreshing instance network info cache due to event network-changed-b7ae7c3a-da81-4cc2-829b-e1305df47423. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 547.220911] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquiring lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 547.221122] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Acquired lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 547.221240] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Refreshing network info cache for port b7ae7c3a-da81-4cc2-829b-e1305df47423 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 547.635078] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Updated VIF entry in instance network info cache for port 88406a1c-f3e4-4564-8a4b-0f4b6fb96d59. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 547.635472] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Updating instance_info_cache with network_info: [{"id": "88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "address": "fa:16:3e:c6:69:91", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap88406a1c-f3", "ovs_interfaceid": "88406a1c-f3e4-4564-8a4b-0f4b6fb96d59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 547.645279] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Releasing lock "refresh_cache-4d131062-1c01-4c64-bf26-d38bf9da59d6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 547.645583] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Received event network-vif-plugged-5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 547.645898] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Acquiring lock "dee2197a-8c39-4655-be3e-e20fb72f518a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 547.646238] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 547.649374] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 547.649374] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d 
req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] No waiting events found dispatching network-vif-plugged-5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 547.649374] env[59382]: WARNING nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Received unexpected event network-vif-plugged-5fe70167-c7b3-4491-83ca-9408151b47bf for instance with vm_state building and task_state spawning. [ 547.649374] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Received event network-changed-5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 547.649825] env[59382]: DEBUG nova.compute.manager [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Refreshing instance network info cache due to event network-changed-5fe70167-c7b3-4491-83ca-9408151b47bf. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 547.649825] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Acquiring lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 547.649825] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Acquired lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 547.649825] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Refreshing network info cache for port 5fe70167-c7b3-4491-83ca-9408151b47bf {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 547.749301] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Updating instance_info_cache with network_info: [{"id": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "address": "fa:16:3e:27:2a:03", "network": {"id": "0bde2be5-6af5-4a51-8f9c-8f0cbfc94d72", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-346836667-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a44ac353f1a7469b88b361b52174882d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d69a4b11-8d65-435f-94a5-28f74a39a718", "external-id": "cl2-zone-59", 
"segmentation_id": 59, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdae9a51-c1", "ovs_interfaceid": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 547.760301] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Releasing lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 547.760786] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance network_info: |[{"id": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "address": "fa:16:3e:27:2a:03", "network": {"id": "0bde2be5-6af5-4a51-8f9c-8f0cbfc94d72", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-346836667-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a44ac353f1a7469b88b361b52174882d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d69a4b11-8d65-435f-94a5-28f74a39a718", "external-id": "cl2-zone-59", "segmentation_id": 59, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdae9a51-c1", "ovs_interfaceid": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 547.761122] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:27:2a:03', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd69a4b11-8d65-435f-94a5-28f74a39a718', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fdae9a51-c14c-449c-a2fd-e7e0e850157f', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 547.769840] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Creating folder: Project (a44ac353f1a7469b88b361b52174882d). Parent ref: group-v459741. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 547.770348] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bf04aac1-116d-43a7-b79d-98c743fe8bdd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.782785] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Created folder: Project (a44ac353f1a7469b88b361b52174882d) in parent group-v459741. [ 547.782968] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Creating folder: Instances. Parent ref: group-v459763. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 547.783217] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b039d56e-8058-4cf5-8005-488bdf68770c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.792054] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Created folder: Instances in parent group-v459763. [ 547.792313] env[59382]: DEBUG oslo.service.loopingcall [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 547.792502] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 547.792745] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7532ba7b-49e9-447c-bdfd-2cdfb482d24b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 547.816436] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 547.816436] env[59382]: value = "task-2256689" [ 547.816436] env[59382]: _type = "Task" [ 547.816436] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 547.824197] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256689, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 548.128497] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Updated VIF entry in instance network info cache for port 5fe70167-c7b3-4491-83ca-9408151b47bf. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 548.128870] env[59382]: DEBUG nova.network.neutron [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Updating instance_info_cache with network_info: [{"id": "5fe70167-c7b3-4491-83ca-9408151b47bf", "address": "fa:16:3e:04:d3:5c", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.106", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe70167-c7", "ovs_interfaceid": "5fe70167-c7b3-4491-83ca-9408151b47bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 548.141578] env[59382]: DEBUG oslo_concurrency.lockutils [req-c27323fc-f978-4d81-bb15-edad998b7b7d req-8626907a-b15d-48a4-8dc5-52151730be15 service nova] Releasing lock "refresh_cache-dee2197a-8c39-4655-be3e-e20fb72f518a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 548.327180] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256689, 'name': CreateVM_Task, 'duration_secs': 0.33043} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 548.327180] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 548.328066] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 548.328363] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 548.328785] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 548.329144] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1cdeb9d5-633d-46d9-9415-d5d8bdaf0183 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 548.334888] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for the task: (returnval){ [ 548.334888] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b15612-c86d-0ead-aacd-2cec8ffbf6d8" [ 548.334888] env[59382]: _type = "Task" [ 548.334888] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 548.343074] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b15612-c86d-0ead-aacd-2cec8ffbf6d8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 548.447376] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Updated VIF entry in instance network info cache for port b7ae7c3a-da81-4cc2-829b-e1305df47423. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 548.447376] env[59382]: DEBUG nova.network.neutron [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Updating instance_info_cache with network_info: [{"id": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "address": "fa:16:3e:49:35:82", "network": {"id": "9f8268fb-191e-4608-a4f7-7b7169b302b9", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1778494407-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7eebf063184a48369400f3da30cf45ac", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6edb8eae-1113-49d0-84f7-9fd9f82b26fb", "external-id": "nsx-vlan-transportzone-493", "segmentation_id": 493, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb7ae7c3a-da", "ovs_interfaceid": "b7ae7c3a-da81-4cc2-829b-e1305df47423", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 548.456361] env[59382]: DEBUG oslo_concurrency.lockutils [req-351d86a2-ad68-44a0-b1ff-bd37ece01eb0 req-13fd4c38-aa2d-4299-8a2a-7d931743a833 service nova] Releasing lock "refresh_cache-f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 548.850946] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 548.851271] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 548.851563] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 549.009833] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.010724] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.027594] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 549.089849] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 549.090704] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 549.092642] env[59382]: INFO nova.compute.claims [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 549.298385] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Successfully updated port: b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 549.311954] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 549.312181] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquired lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 549.312395] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Building network info cache for instance {{(pid=59382) 
_get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 549.376058] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fed3eac-53c1-43ab-ad18-70cc90d5383e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.384095] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9c4cad9-5792-42f0-ad1a-9b145323cf85 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.416886] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-119d6c10-4b4a-46fd-a8d1-dc5803552b85 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.425073] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6763e1e-fe43-41ba-b944-30989c3ca205 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.438713] env[59382]: DEBUG nova.compute.provider_tree [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 549.440618] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 549.450581] env[59382]: DEBUG nova.scheduler.client.report [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 549.471123] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 549.471123] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 549.504072] env[59382]: DEBUG nova.compute.utils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 549.507858] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Not allocating networking since 'none' was specified. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 549.517614] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 549.597810] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 549.622730] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 549.622981] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 549.623153] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 549.623370] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 549.623570] env[59382]: DEBUG nova.virt.hardware [None 
req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 549.623749] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 549.623962] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 549.624139] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 549.624335] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 549.624503] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 549.624677] env[59382]: DEBUG nova.virt.hardware [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 549.625628] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-945d57c8-4069-4f71-9a99-a4b812e957c5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.633823] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2ba7da3-da80-4e6f-868d-e7fa15c7507e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.648265] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance VIF info [] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 549.653597] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Creating folder: Project (bc6b68febc8342b79ba3cfd312f719d6). Parent ref: group-v459741. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.656296] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ae885a95-6b07-4175-b9c1-b15888f0652c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.667140] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Created folder: Project (bc6b68febc8342b79ba3cfd312f719d6) in parent group-v459741. [ 549.667241] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Creating folder: Instances. Parent ref: group-v459766. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 549.667466] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-69114dc1-bfa0-4d44-a9d3-0cf6e7f38111 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.679123] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Created folder: Instances in parent group-v459766. [ 549.679382] env[59382]: DEBUG oslo.service.loopingcall [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 549.679590] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 549.679780] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c8c8d558-1570-4b73-b4ae-43a9b1fd598c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 549.696466] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 549.696466] env[59382]: value = "task-2256692" [ 549.696466] env[59382]: _type = "Task" [ 549.696466] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 549.704012] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256692, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 550.066835] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Updating instance_info_cache with network_info: [{"id": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "address": "fa:16:3e:de:ba:24", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5c4ee87-ed", "ovs_interfaceid": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 550.079526] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Releasing lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 550.079881] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance network_info: |[{"id": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "address": "fa:16:3e:de:ba:24", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5c4ee87-ed", "ovs_interfaceid": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 550.080212] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None 
req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:ba:24', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b5c4ee87-ed68-4cab-896f-21d59bcbf2bf', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 550.088193] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Creating folder: Project (96497a8820b54190935f68de46154ed0). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 550.088730] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-00bd7e7e-dfb4-440b-b8ce-3cb1fbbedfa8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.101179] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Created folder: Project (96497a8820b54190935f68de46154ed0) in parent group-v459741. [ 550.101179] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Creating folder: Instances. Parent ref: group-v459769. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 550.101179] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-560b139f-e73a-46a9-bc9f-6d9c04ee5318 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.110756] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Created folder: Instances in parent group-v459769. [ 550.110977] env[59382]: DEBUG oslo.service.loopingcall [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 550.111185] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 550.111380] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bce80293-874f-4d86-b067-e826ff77e20f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.132023] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 550.132023] env[59382]: value = "task-2256695" [ 550.132023] env[59382]: _type = "Task" [ 550.132023] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 550.138588] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256695, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 550.158745] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Received event network-vif-plugged-e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 550.158912] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquiring lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 550.159301] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 550.159301] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 550.159398] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] No waiting events found dispatching network-vif-plugged-e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 550.159548] env[59382]: WARNING nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Received unexpected event network-vif-plugged-e086370f-bda5-46e8-90c0-be1605e95fe4 for instance with vm_state building and task_state spawning. [ 550.159703] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Received event network-changed-e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 550.159849] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Refreshing instance network info cache due to event network-changed-e086370f-bda5-46e8-90c0-be1605e95fe4. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 550.160036] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquiring lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 550.160682] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquired lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 550.160682] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Refreshing network info cache for port e086370f-bda5-46e8-90c0-be1605e95fe4 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 550.206630] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256692, 'name': CreateVM_Task, 'duration_secs': 0.241622} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 550.206801] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 550.207259] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 550.207419] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 550.207746] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 550.208000] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-efe26ea1-43df-46c6-a587-40afdf333379 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.213318] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for the task: (returnval){ [ 550.213318] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52a6698c-fbad-9d3e-9331-2379c5b7e353" [ 550.213318] env[59382]: _type = "Task" [ 550.213318] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 550.221494] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52a6698c-fbad-9d3e-9331-2379c5b7e353, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 550.641239] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256695, 'name': CreateVM_Task, 'duration_secs': 0.308138} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 550.641574] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 550.642151] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 550.722378] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 550.722729] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 550.722813] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 550.723030] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 550.723319] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 550.723548] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f70ce49-e837-4d08-a4e6-b80f290f6772 {{(pid=59382) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 550.727960] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for the task: (returnval){ [ 550.727960] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5256d9d1-efa8-263a-2512-c0ae4bb58516" [ 550.727960] env[59382]: _type = "Task" [ 550.727960] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 550.735896] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5256d9d1-efa8-263a-2512-c0ae4bb58516, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 550.809630] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Updated VIF entry in instance network info cache for port e086370f-bda5-46e8-90c0-be1605e95fe4. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 550.810082] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Updating instance_info_cache with network_info: [{"id": "e086370f-bda5-46e8-90c0-be1605e95fe4", "address": "fa:16:3e:d3:7b:9a", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.247", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape086370f-bd", "ovs_interfaceid": "e086370f-bda5-46e8-90c0-be1605e95fe4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 550.821199] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Releasing lock "refresh_cache-ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 550.821441] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Received event network-vif-plugged-56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:11048}} [ 550.821625] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquiring lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 550.821817] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 550.822018] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 550.822137] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] No waiting events found dispatching network-vif-plugged-56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 550.822318] env[59382]: WARNING nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Received unexpected event network-vif-plugged-56ced3eb-6deb-43f2-b0c2-d3e238e3206c for instance with vm_state building and task_state spawning. [ 550.822479] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Received event network-changed-56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 550.822631] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Refreshing instance network info cache due to event network-changed-56ced3eb-6deb-43f2-b0c2-d3e238e3206c. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 550.822812] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquiring lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 550.822945] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquired lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 550.827019] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Refreshing network info cache for port 56ced3eb-6deb-43f2-b0c2-d3e238e3206c {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 551.240089] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 551.240089] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 551.240089] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 551.544064] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Updated VIF entry in instance network info cache for port 56ced3eb-6deb-43f2-b0c2-d3e238e3206c. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 551.544945] env[59382]: DEBUG nova.network.neutron [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Updating instance_info_cache with network_info: [{"id": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "address": "fa:16:3e:6a:cb:ff", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap56ced3eb-6d", "ovs_interfaceid": "56ced3eb-6deb-43f2-b0c2-d3e238e3206c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 551.555080] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Releasing lock "refresh_cache-4075452d-d1ef-4fb7-8fa1-50ef80998151" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 551.555516] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Received event network-vif-plugged-fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 551.555852] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Acquiring lock "6feee415-28ca-42b4-bd0a-ea5e531b117c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 551.557111] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 551.557111] env[59382]: DEBUG oslo_concurrency.lockutils [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 551.557111] env[59382]: DEBUG nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba 
req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] No waiting events found dispatching network-vif-plugged-fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 551.557111] env[59382]: WARNING nova.compute.manager [req-ca0346a9-2df0-407d-8b76-dd1a4ce581ba req-88b8c44e-87c5-418b-9ab2-df588e4212c7 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Received unexpected event network-vif-plugged-fdae9a51-c14c-449c-a2fd-e7e0e850157f for instance with vm_state building and task_state spawning. [ 553.678966] env[59382]: DEBUG nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Received event network-changed-fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 553.679236] env[59382]: DEBUG nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Refreshing instance network info cache due to event network-changed-fdae9a51-c14c-449c-a2fd-e7e0e850157f. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 553.679442] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Acquiring lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 553.679514] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Acquired lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 553.679698] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Refreshing network info cache for port fdae9a51-c14c-449c-a2fd-e7e0e850157f {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 554.308166] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Updated VIF entry in instance network info cache for port fdae9a51-c14c-449c-a2fd-e7e0e850157f. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 554.308166] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Updating instance_info_cache with network_info: [{"id": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "address": "fa:16:3e:27:2a:03", "network": {"id": "0bde2be5-6af5-4a51-8f9c-8f0cbfc94d72", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-346836667-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a44ac353f1a7469b88b361b52174882d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d69a4b11-8d65-435f-94a5-28f74a39a718", "external-id": "cl2-zone-59", "segmentation_id": 59, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfdae9a51-c1", "ovs_interfaceid": "fdae9a51-c14c-449c-a2fd-e7e0e850157f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 554.318768] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Releasing lock "refresh_cache-6feee415-28ca-42b4-bd0a-ea5e531b117c" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 554.319027] env[59382]: DEBUG nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Received event network-vif-plugged-b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 554.319224] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Acquiring lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.319421] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.319580] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 554.319742] env[59382]: DEBUG nova.compute.manager 
[req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] No waiting events found dispatching network-vif-plugged-b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 554.319907] env[59382]: WARNING nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Received unexpected event network-vif-plugged-b5c4ee87-ed68-4cab-896f-21d59bcbf2bf for instance with vm_state building and task_state spawning. [ 554.320077] env[59382]: DEBUG nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Received event network-changed-b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 554.320232] env[59382]: DEBUG nova.compute.manager [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Refreshing instance network info cache due to event network-changed-b5c4ee87-ed68-4cab-896f-21d59bcbf2bf. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 554.320412] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Acquiring lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 554.320557] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Acquired lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 554.320714] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Refreshing network info cache for port b5c4ee87-ed68-4cab-896f-21d59bcbf2bf {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 554.826445] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Updated VIF entry in instance network info cache for port b5c4ee87-ed68-4cab-896f-21d59bcbf2bf. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 554.826811] env[59382]: DEBUG nova.network.neutron [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Updating instance_info_cache with network_info: [{"id": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "address": "fa:16:3e:de:ba:24", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.89", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb5c4ee87-ed", "ovs_interfaceid": "b5c4ee87-ed68-4cab-896f-21d59bcbf2bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 554.837032] env[59382]: DEBUG oslo_concurrency.lockutils [req-4ed60504-1dcc-435f-9839-af4b444796a0 req-62af2dfd-9c77-458a-8317-56bc14c787c5 service nova] Releasing lock "refresh_cache-d31427c1-9979-4617-b5a1-43aee722d88d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 585.857358] env[59382]: WARNING oslo_vmware.rw_handles [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 585.857358] env[59382]: ERROR oslo_vmware.rw_handles [ 585.858116] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-44093290-b9c6-4be2-83f8-710bd35ee062 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 585.859542] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 585.859879] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Copying Virtual Disk [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/4d24b225-ba83-4fbe-a046-c4523076fc48/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 585.860638] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-34c2c915-3350-492e-b8ca-24dd87b18e99 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.870768] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Waiting for the task: (returnval){ [ 585.870768] env[59382]: value = "task-2256696" [ 585.870768] env[59382]: _type = "Task" [ 585.870768] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 585.879964] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Task: {'id': task-2256696, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 586.381765] env[59382]: DEBUG oslo_vmware.exceptions [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 586.382034] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 586.385333] env[59382]: ERROR nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 586.385333] env[59382]: Faults: ['InvalidArgument'] [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] Traceback (most recent call last): [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] yield resources [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self.driver.spawn(context, instance, image_meta, [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self._vmops.spawn(context, instance, image_meta, injected_files, [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self._fetch_image_if_missing(context, vi) [ 586.385333] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] image_cache(vi, tmp_image_ds_loc) [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] vm_util.copy_virtual_disk( [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] session._wait_for_task(vmdk_copy_task) [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return self.wait_for_task(task_ref) [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return evt.wait() [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] result = hub.switch() [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 586.385802] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return self.greenlet.switch() [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self.f(*self.args, **self.kw) [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] raise exceptions.translate_fault(task_info.error) [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] Faults: ['InvalidArgument'] [ 586.386201] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] [ 586.386201] env[59382]: INFO nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Terminating instance [ 586.387961] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 586.388178] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 586.388481] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 586.388675] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.389451] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cbb14b3-147a-46f0-8d34-ec03f8416cda {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.392887] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-92a1c5c9-655f-4e4c-8584-0831d86023e8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.400453] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 586.401574] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a8e95f76-74dc-4141-80d8-af5e86e71811 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.403605] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.403605] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 586.403994] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e083c4e6-3d25-40eb-b19b-b4b1e272da2e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.409949] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Waiting for the task: (returnval){ [ 586.409949] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52adb01e-3c62-b76e-c745-582c8c4ccb10" [ 586.409949] env[59382]: _type = "Task" [ 586.409949] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 586.416985] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52adb01e-3c62-b76e-c745-582c8c4ccb10, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 586.919946] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 586.920399] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Creating directory with path [datastore1] vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.920513] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-de6023ca-8a62-4e09-aaba-51de70e0f639 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.941641] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Created directory with path [datastore1] vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.941854] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Fetch image to [datastore1] vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 586.942056] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] 
vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 586.942938] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8625615b-052c-4067-ae79-4579894d18cb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.951984] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6143109f-1520-49b0-a708-a56073a4a2c1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.961416] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed3f9fdd-183a-4c57-916f-15a533a864d2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.992889] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36b4017c-d1b2-4be8-bbce-783aa08c71c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.999797] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bb385fee-1e31-4d0a-af63-0a299f09e4e1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.033322] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 587.106360] env[59382]: DEBUG oslo_vmware.rw_handles [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 587.181918] env[59382]: DEBUG oslo_vmware.rw_handles [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 587.182135] env[59382]: DEBUG oslo_vmware.rw_handles [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 587.595020] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 587.595020] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 587.595020] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Deleting the datastore file [datastore1] f1ce8104-72de-488b-8a41-379978af0f54 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 587.595020] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0e5d922f-fc22-467c-a753-de5e04c70b67 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.603641] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Waiting for the task: (returnval){ [ 587.603641] env[59382]: value = "task-2256698" [ 587.603641] env[59382]: _type = "Task" [ 587.603641] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 587.617693] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Task: {'id': task-2256698, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 588.115444] env[59382]: DEBUG oslo_vmware.api [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Task: {'id': task-2256698, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.094214} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 588.115721] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 588.115880] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 588.116070] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 588.116758] env[59382]: INFO nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Took 1.73 seconds to destroy the instance on the hypervisor. [ 588.119974] env[59382]: DEBUG nova.compute.claims [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 588.120170] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.120389] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.338137] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb79a46-ab3b-41a1-a4d9-c101af868786 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.346974] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d7d0757-2be2-42d8-8bb9-652c6f8cfc51 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.380848] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e19f62-eb40-441c-bbbb-963d2bbacc75 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.389349] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43e1b740-2f5c-49c3-83f8-08182fb3ed5d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.407997] env[59382]: DEBUG nova.compute.provider_tree [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 588.424022] env[59382]: DEBUG nova.scheduler.client.report [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 588.440955] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.318s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.440955] env[59382]: ERROR nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 588.440955] env[59382]: Faults: ['InvalidArgument'] [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] Traceback (most recent call last): [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self.driver.spawn(context, instance, image_meta, [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self._vmops.spawn(context, instance, image_meta, injected_files, [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 588.440955] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] 
self._fetch_image_if_missing(context, vi) [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] image_cache(vi, tmp_image_ds_loc) [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] vm_util.copy_virtual_disk( [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] session._wait_for_task(vmdk_copy_task) [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return self.wait_for_task(task_ref) [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return evt.wait() [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] result = hub.switch() [ 588.441497] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] return self.greenlet.switch() [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] self.f(*self.args, **self.kw) [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] raise exceptions.translate_fault(task_info.error) [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] Faults: ['InvalidArgument'] [ 588.441979] env[59382]: ERROR nova.compute.manager [instance: f1ce8104-72de-488b-8a41-379978af0f54] [ 588.441979] env[59382]: DEBUG nova.compute.utils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 
tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 588.446212] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Build of instance f1ce8104-72de-488b-8a41-379978af0f54 was re-scheduled: A specified parameter was not correct: fileType [ 588.446212] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 588.446212] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 588.446212] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 588.446212] env[59382]: DEBUG nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 588.446701] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 590.009850] env[59382]: DEBUG nova.network.neutron [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 590.025898] env[59382]: INFO nova.compute.manager [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] [instance: f1ce8104-72de-488b-8a41-379978af0f54] Took 1.58 seconds to deallocate network for instance. 
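[Editor's annotation] The tracebacks above show the failure path for instance f1ce8104: _fetch_image_if_missing caches the sparse image via CopyVirtualDisk_Task, and oslo.vmware's wait_for_task polls the task in a looping call until it either succeeds or faults, at which point translate_fault raises VimFaultException ("A specified parameter was not correct: fileType"), the compute manager destroys the half-built VM, aborts the resource claim, and reschedules the build. Below is a minimal self-contained sketch of that poll-and-translate pattern; the dict-based task info, poll_task_info callable, and this VimFaultException class are simplified stand-ins for illustration, not the actual oslo.vmware API.

    # Simplified sketch of the wait_for_task/_poll_task loop seen in the
    # tracebacks above; stand-in types, not the oslo.vmware implementation.
    import time

    class VimFaultException(Exception):
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list  # e.g. ['InvalidArgument']

    def wait_for_task(poll_task_info, interval=0.5):
        """Poll task info until the task succeeds or raise its server fault."""
        while True:
            info = poll_task_info()          # one PropertyCollector round-trip
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # mirrors exceptions.translate_fault(task_info.error)
                raise VimFaultException(info["faults"], info["message"])
            time.sleep(interval)             # still queued/running: keep polling

    # Example: a CopyVirtualDisk_Task that fails server-side, as in this log.
    attempts = iter([
        {"state": "running", "progress": 0},
        {"state": "error", "faults": ["InvalidArgument"],
         "message": "A specified parameter was not correct: fileType"},
    ])
    try:
        wait_for_task(lambda: next(attempts), interval=0.01)
    except VimFaultException as exc:
        print(exc, exc.fault_list)

Because the exception propagates out of driver.spawn, _build_and_run_instance treats it as a retryable build failure rather than a crash, which is why the log continues with "Terminating instance", "Aborting claim", and "was re-scheduled" instead of a service error.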
[ 590.193096] env[59382]: INFO nova.scheduler.client.report [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Deleted allocations for instance f1ce8104-72de-488b-8a41-379978af0f54 [ 590.215979] env[59382]: DEBUG oslo_concurrency.lockutils [None req-44093290-b9c6-4be2-83f8-710bd35ee062 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604 tempest-FloatingIPsAssociationNegativeTestJSON-1210241604-project-member] Lock "f1ce8104-72de-488b-8a41-379978af0f54" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 58.630s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 590.216269] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "f1ce8104-72de-488b-8a41-379978af0f54" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 52.421s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 590.216449] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f1ce8104-72de-488b-8a41-379978af0f54] During sync_power_state the instance has a pending task (spawning). Skip. [ 590.216611] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "f1ce8104-72de-488b-8a41-379978af0f54" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.991174] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.020368] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.023524] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 603.023687] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 603.043954] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044137] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044274] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044599] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044599] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044714] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044749] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044870] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.044980] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 603.048143] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 603.048684] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.049232] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.526800] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.527059] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.527223] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.527371] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.527513] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 603.550320] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.550320] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.550320] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 603.550320] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 603.550773] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-eed39f60-dd66-4b8f-b789-5f85b3fe6407 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.567609] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5eb1f147-31dc-4ca2-98f1-992a2e7a68f1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.585532] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac97587-5d3d-4504-bd57-aa595a882be3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.596847] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29037df2-f3f2-4aaf-87e2-97f0635ce1de {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.630657] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181245MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 603.630752] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 603.630953] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 603.715304] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4c529a26-0160-441b-b46c-7e794b079249 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.715668] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.715668] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance dee2197a-8c39-4655-be3e-e20fb72f518a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.716037] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4d131062-1c01-4c64-bf26-d38bf9da59d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.716037] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.716037] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.716037] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.716182] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.717592] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 603.717592] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 603.717592] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 603.874515] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a08c3e07-c59d-4861-b953-cdb421114b8a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.882894] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f7303e0-3187-4e7b-b1ff-f38bae9356bf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.914823] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60aea186-2c10-4f7c-b3b3-ad0de337139a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.924221] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593d57c2-38f2-4b4e-a334-1ff96cefde02 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.938962] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 603.947735] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 603.962308] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 603.962516] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.332s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 604.962565] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running 
periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 604.962838] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 621.927753] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "c2f5545d-884a-4166-a93b-810ef311c2e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.928054] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "c2f5545d-884a-4166-a93b-810ef311c2e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.941433] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 621.996763] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.997272] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.999232] env[59382]: INFO nova.compute.claims [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.209960] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33abe7af-f629-43b9-895e-678282c6c448 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.219278] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d04688c1-647a-4fd6-80dd-17d1af3ec44e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.256812] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-597227b1-67b1-4de1-a10f-4e7c86248224 
{{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.264400] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ad2e7b-ab70-44be-ab8c-483d4563db1e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.278107] env[59382]: DEBUG nova.compute.provider_tree [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.286935] env[59382]: DEBUG nova.scheduler.client.report [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.301768] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.302288] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 622.343238] env[59382]: DEBUG nova.compute.utils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.344950] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Allocating IP information in the background. 
{{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 622.345534] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.358687] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 622.430125] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 622.452774] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.453038] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.453205] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.453382] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.453804] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.453804] 
env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.454008] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.455061] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 622.455252] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.455426] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.456064] env[59382]: DEBUG nova.virt.hardware [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.456521] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6cbce45-6758-4bd1-a903-f79596d60892 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.466990] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73a9fc93-e0a9-4eab-9372-48b28981f00f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.608610] env[59382]: DEBUG nova.policy [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '031903fe94be49ca85372645e5d0d824', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c4cad7eaa1d480aa335eda457a6a083', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 624.219056] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 
tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Successfully created port: be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 624.665627] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "feea4bca-d134-475f-81b9-c8415bacf1f1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 624.665627] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.612937] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "3c235411-c50f-40b5-a681-ca42b7838506" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.613428] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "3c235411-c50f-40b5-a681-ca42b7838506" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 627.083808] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Successfully updated port: be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 627.095842] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 627.096060] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquired lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 627.096229] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Building network 
info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.449948] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 628.118736] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Updating instance_info_cache with network_info: [{"id": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "address": "fa:16:3e:f7:7b:4a", "network": {"id": "0298e811-9aea-40fc-a9bd-a4466da82386", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-523903872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c4cad7eaa1d480aa335eda457a6a083", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f41e4aa-0d23-48c4-a359-574abb2e7b9a", "external-id": "nsx-vlan-transportzone-695", "segmentation_id": 695, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbe25ccbc-e1", "ovs_interfaceid": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 628.135752] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Releasing lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 628.136146] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance network_info: |[{"id": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "address": "fa:16:3e:f7:7b:4a", "network": {"id": "0298e811-9aea-40fc-a9bd-a4466da82386", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-523903872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c4cad7eaa1d480aa335eda457a6a083", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f41e4aa-0d23-48c4-a359-574abb2e7b9a", 
"external-id": "nsx-vlan-transportzone-695", "segmentation_id": 695, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbe25ccbc-e1", "ovs_interfaceid": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 628.136447] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:7b:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6f41e4aa-0d23-48c4-a359-574abb2e7b9a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'be25ccbc-e158-419e-a30d-fc08cdc894e1', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 628.151268] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Creating folder: Project (8c4cad7eaa1d480aa335eda457a6a083). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.151268] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-daade37b-3555-432c-9d49-f2023e28c8cd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.163479] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Created folder: Project (8c4cad7eaa1d480aa335eda457a6a083) in parent group-v459741. [ 628.163479] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Creating folder: Instances. Parent ref: group-v459772. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 628.163730] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-550d2ecd-e91f-4144-8aab-beeb38013f81 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.175591] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Created folder: Instances in parent group-v459772. [ 628.175847] env[59382]: DEBUG oslo.service.loopingcall [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 628.176078] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 628.176294] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9379548d-4fcd-454c-938c-8723d365edb1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.200476] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 628.200476] env[59382]: value = "task-2256701" [ 628.200476] env[59382]: _type = "Task" [ 628.200476] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 628.210378] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256701, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 628.713242] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256701, 'name': CreateVM_Task, 'duration_secs': 0.332049} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 628.713242] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 628.714478] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 628.715020] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 628.715699] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 628.716405] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a926f56f-762b-4d19-a2c8-bcfa89774df3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 628.725108] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Waiting for the task: (returnval){ [ 628.725108] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b530de-849f-e678-9135-8915cd5566db" [ 628.725108] env[59382]: _type = "Task" [ 628.725108] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 628.731630] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b530de-849f-e678-9135-8915cd5566db, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 629.237794] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 629.238147] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 629.238289] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.245715] env[59382]: DEBUG nova.compute.manager [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Received event network-vif-plugged-be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 629.245961] env[59382]: DEBUG oslo_concurrency.lockutils [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] Acquiring lock "c2f5545d-884a-4166-a93b-810ef311c2e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 629.246044] env[59382]: DEBUG oslo_concurrency.lockutils [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] Lock "c2f5545d-884a-4166-a93b-810ef311c2e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 629.247329] env[59382]: DEBUG oslo_concurrency.lockutils [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] Lock "c2f5545d-884a-4166-a93b-810ef311c2e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 629.247329] env[59382]: DEBUG nova.compute.manager [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] No waiting events found dispatching 
network-vif-plugged-be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 629.247329] env[59382]: WARNING nova.compute.manager [req-e3b93c90-27e8-435e-9d80-4a1ba0c6f45f req-1c468463-5cd0-45c6-b2ce-95006f086586 service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Received unexpected event network-vif-plugged-be25ccbc-e158-419e-a30d-fc08cdc894e1 for instance with vm_state building and task_state spawning. [ 630.943971] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "acae2ecc-9a00-4356-96d7-a7521ea46f32" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.943971] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "acae2ecc-9a00-4356-96d7-a7521ea46f32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 630.953442] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "cf672665-36c7-4251-a32a-537b9d4c38ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 630.953877] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "cf672665-36c7-4251-a32a-537b9d4c38ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 632.593941] env[59382]: DEBUG nova.compute.manager [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Received event network-changed-be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 632.593941] env[59382]: DEBUG nova.compute.manager [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Refreshing instance network info cache due to event network-changed-be25ccbc-e158-419e-a30d-fc08cdc894e1. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 632.593941] env[59382]: DEBUG oslo_concurrency.lockutils [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] Acquiring lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.593941] env[59382]: DEBUG oslo_concurrency.lockutils [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] Acquired lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.593941] env[59382]: DEBUG nova.network.neutron [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Refreshing network info cache for port be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 632.603065] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 632.603478] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.081405] env[59382]: DEBUG nova.network.neutron [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Updated VIF entry in instance network info cache for port be25ccbc-e158-419e-a30d-fc08cdc894e1. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 633.081666] env[59382]: DEBUG nova.network.neutron [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Updating instance_info_cache with network_info: [{"id": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "address": "fa:16:3e:f7:7b:4a", "network": {"id": "0298e811-9aea-40fc-a9bd-a4466da82386", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-523903872-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c4cad7eaa1d480aa335eda457a6a083", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6f41e4aa-0d23-48c4-a359-574abb2e7b9a", "external-id": "nsx-vlan-transportzone-695", "segmentation_id": 695, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbe25ccbc-e1", "ovs_interfaceid": "be25ccbc-e158-419e-a30d-fc08cdc894e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.096856] env[59382]: DEBUG oslo_concurrency.lockutils [req-09806d93-9d31-4578-b158-3fe9e981d5aa req-b3080e7b-9ce4-4756-aaa8-b765925f33ac service nova] Releasing lock "refresh_cache-c2f5545d-884a-4166-a93b-810ef311c2e6" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.346246] env[59382]: WARNING oslo_vmware.rw_handles [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 633.346246] env[59382]: ERROR oslo_vmware.rw_handles [ 633.346782] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad 
tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 633.351022] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 633.351022] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Copying Virtual Disk [datastore1] vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/85d0f8a9-b64c-40c1-861d-e965cd547311/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 633.351022] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-01301ec4-ce7e-4d4c-9484-e6c88d2e9c55 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.362588] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Waiting for the task: (returnval){ [ 633.362588] env[59382]: value = "task-2256702" [ 633.362588] env[59382]: _type = "Task" [ 633.362588] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.371631] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Task: {'id': task-2256702, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.539719] env[59382]: DEBUG oslo_concurrency.lockutils [None req-dd343cee-b907-404f-9f42-6fa531c5778a tempest-VolumesAssistedSnapshotsTest-800723085 tempest-VolumesAssistedSnapshotsTest-800723085-project-member] Acquiring lock "6ed2d21c-66d8-4549-b58e-0cbdcb518f48" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 633.539966] env[59382]: DEBUG oslo_concurrency.lockutils [None req-dd343cee-b907-404f-9f42-6fa531c5778a tempest-VolumesAssistedSnapshotsTest-800723085 tempest-VolumesAssistedSnapshotsTest-800723085-project-member] Lock "6ed2d21c-66d8-4549-b58e-0cbdcb518f48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 633.876559] env[59382]: DEBUG oslo_vmware.exceptions [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Fault InvalidArgument not matched. {{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 633.876826] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.878364] env[59382]: ERROR nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 633.878364] env[59382]: Faults: ['InvalidArgument'] [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] Traceback (most recent call last): [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] yield resources [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self.driver.spawn(context, instance, image_meta, [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self._vmops.spawn(context, instance, image_meta, injected_files, [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 633.878364] env[59382]: ERROR 
nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self._fetch_image_if_missing(context, vi) [ 633.878364] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] image_cache(vi, tmp_image_ds_loc) [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] vm_util.copy_virtual_disk( [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] session._wait_for_task(vmdk_copy_task) [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return self.wait_for_task(task_ref) [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return evt.wait() [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] result = hub.switch() [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 633.878707] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return self.greenlet.switch() [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self.f(*self.args, **self.kw) [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] raise exceptions.translate_fault(task_info.error) [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] Faults: ['InvalidArgument'] [ 633.879055] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] [ 633.879055] env[59382]: INFO nova.compute.manager [None 
req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Terminating instance [ 633.880323] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.880534] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 633.881240] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 633.881439] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 633.881669] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aedfef3a-114b-44c4-ac5a-96b40f8ae5a1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.884426] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877b1397-bb98-4349-a5ae-b306ce101706 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.900904] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 633.901224] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 633.901388] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 633.902083] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5a1b2457-ddd6-41f3-9b5a-dfb103c1bd52 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.903673] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7904bc1-2745-4ce8-868b-11752cac25ad {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.909929] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Waiting for the task: (returnval){ [ 633.909929] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]523a7f6d-eaa5-9b7d-20b7-44e6e7f26352" [ 633.909929] env[59382]: _type = "Task" [ 633.909929] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.920648] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]523a7f6d-eaa5-9b7d-20b7-44e6e7f26352, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.984441] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 633.984678] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 633.984904] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Deleting the datastore file [datastore1] 4c529a26-0160-441b-b46c-7e794b079249 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 633.985136] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1a314648-5a86-4cb9-982f-42972e8cfccf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.991297] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Waiting for the task: (returnval){ [ 633.991297] env[59382]: value = "task-2256704" [ 633.991297] env[59382]: _type = "Task" [ 633.991297] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 634.000864] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Task: {'id': task-2256704, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 634.422225] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 634.422480] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Creating directory with path [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 634.422711] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f66bcf37-a1f9-4c5e-9ae3-73b89db5e25d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.437423] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Created directory with path [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 634.437633] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Fetch image to [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 634.437851] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 634.438585] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9d61269-e0e6-4602-a6f6-6121db46f6dc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.446761] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9ff3d77-892a-467a-b94d-1270f5bb1184 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.458762] env[59382]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d385ed82-1c27-4f63-9b2f-8a8219f9b3fb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.500293] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-785327f1-0e42-4f41-9e59-f3655b074222 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.508800] env[59382]: DEBUG oslo_vmware.api [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Task: {'id': task-2256704, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079965} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 634.510624] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 634.511066] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 634.511344] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 634.511528] env[59382]: INFO nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 634.514063] env[59382]: DEBUG nova.compute.claims [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 634.516615] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 634.516615] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 634.518381] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1883ece4-8ba2-42c2-a7b9-8f21e33f8ab7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.548434] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 634.615593] env[59382]: DEBUG oslo_vmware.rw_handles [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 634.677140] env[59382]: DEBUG oslo_vmware.rw_handles [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 634.677577] env[59382]: DEBUG oslo_vmware.rw_handles [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 634.839095] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff848c3c-a99e-4f42-89a9-f8d8e6c29efe {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.846468] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6778c1f-1055-4e66-a425-c8bf51b2429d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.882057] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5087021a-20d8-4d5f-b1de-b4fdf9752d66 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.886748] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e961817-75c4-4c7a-998b-54ec16c32421 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.900143] env[59382]: DEBUG nova.compute.provider_tree [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 634.908288] env[59382]: DEBUG nova.scheduler.client.report [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 634.933686] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.418s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 634.935155] env[59382]: ERROR nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 634.935155] env[59382]: Faults: ['InvalidArgument'] [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] Traceback (most recent call last): [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] 
self.driver.spawn(context, instance, image_meta, [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self._vmops.spawn(context, instance, image_meta, injected_files, [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self._fetch_image_if_missing(context, vi) [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] image_cache(vi, tmp_image_ds_loc) [ 634.935155] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] vm_util.copy_virtual_disk( [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] session._wait_for_task(vmdk_copy_task) [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return self.wait_for_task(task_ref) [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return evt.wait() [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] result = hub.switch() [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] return self.greenlet.switch() [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 634.935441] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] self.f(*self.args, **self.kw) [ 634.935712] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 634.935712] env[59382]: ERROR nova.compute.manager [instance: 
4c529a26-0160-441b-b46c-7e794b079249] raise exceptions.translate_fault(task_info.error) [ 634.935712] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 634.935712] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] Faults: ['InvalidArgument'] [ 634.935712] env[59382]: ERROR nova.compute.manager [instance: 4c529a26-0160-441b-b46c-7e794b079249] [ 634.935712] env[59382]: DEBUG nova.compute.utils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 634.936471] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Build of instance 4c529a26-0160-441b-b46c-7e794b079249 was re-scheduled: A specified parameter was not correct: fileType [ 634.936471] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 634.936827] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 634.936995] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 634.937161] env[59382]: DEBUG nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 634.937323] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 635.432910] env[59382]: DEBUG nova.network.neutron [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 635.446400] env[59382]: INFO nova.compute.manager [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] [instance: 4c529a26-0160-441b-b46c-7e794b079249] Took 0.51 seconds to deallocate network for instance. 
[ 635.551669] env[59382]: INFO nova.scheduler.client.report [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Deleted allocations for instance 4c529a26-0160-441b-b46c-7e794b079249 [ 635.577598] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aab1bb27-5cc3-4179-8c1e-2a62bd2f3fad tempest-ServerDiagnosticsTest-1246687487 tempest-ServerDiagnosticsTest-1246687487-project-member] Lock "4c529a26-0160-441b-b46c-7e794b079249" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 102.152s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.580994] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "4c529a26-0160-441b-b46c-7e794b079249" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 97.785s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.580994] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4c529a26-0160-441b-b46c-7e794b079249] During sync_power_state the instance has a pending task (spawning). Skip. [ 635.580994] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "4c529a26-0160-441b-b46c-7e794b079249" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.601145] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 635.652622] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.652871] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.654830] env[59382]: INFO nova.compute.claims [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 635.896557] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ab6fd7-f865-499e-9c27-f8abe8e86f5e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.904906] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c71aed-11b0-4b2f-a453-449ae0cea9ff {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.936125] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe7410c4-b25f-4327-a40c-df779955dd6e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.943622] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d61db273-688c-4a63-8ddf-4257fba28bc5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.957943] env[59382]: DEBUG nova.compute.provider_tree [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 635.967522] env[59382]: DEBUG nova.scheduler.client.report [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 635.980453] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 
tempest-ServersTestMultiNic-1590974874-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 635.980920] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 636.176020] env[59382]: DEBUG nova.compute.utils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 636.177470] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 636.177672] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 636.186180] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 636.254954] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 636.258163] env[59382]: DEBUG nova.policy [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '354050e0969546af971752ee9a34fcf2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '01dee33932cb4b16bca7b28b4989593a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 636.279712] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 636.279955] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 636.280124] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 636.280307] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 636.280448] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 636.280590] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 636.280792] env[59382]: DEBUG nova.virt.hardware [None 
req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 636.280957] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 636.281161] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 636.281327] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 636.281499] env[59382]: DEBUG nova.virt.hardware [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 636.282384] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f99f687a-a7a9-48f5-98c0-0acc72dbe8bf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.291437] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e6169d-c3c6-4583-9266-a94cd1916267 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.425608] env[59382]: DEBUG oslo_concurrency.lockutils [None req-c83e5600-8294-491e-8990-3082774618d8 tempest-ServersTestJSON-2096502770 tempest-ServersTestJSON-2096502770-project-member] Acquiring lock "15f9e508-12f2-42db-b6d1-d9b154b94da3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.425842] env[59382]: DEBUG oslo_concurrency.lockutils [None req-c83e5600-8294-491e-8990-3082774618d8 tempest-ServersTestJSON-2096502770 tempest-ServersTestJSON-2096502770-project-member] Lock "15f9e508-12f2-42db-b6d1-d9b154b94da3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.110492] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Successfully created port: 7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 637.678824] env[59382]: DEBUG nova.network.neutron [None 
req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Successfully created port: 3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 639.131773] env[59382]: DEBUG oslo_concurrency.lockutils [None req-3229f2a1-10f2-4096-ba43-2e7220625ab7 tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Acquiring lock "165f09fe-9785-46dd-9016-53fc7838fc14" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 639.132063] env[59382]: DEBUG oslo_concurrency.lockutils [None req-3229f2a1-10f2-4096-ba43-2e7220625ab7 tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Lock "165f09fe-9785-46dd-9016-53fc7838fc14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.156530] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Successfully updated port: 7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 640.141637] env[59382]: DEBUG nova.compute.manager [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-vif-plugged-7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 640.141914] env[59382]: DEBUG oslo_concurrency.lockutils [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] Acquiring lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 640.142078] env[59382]: DEBUG oslo_concurrency.lockutils [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 640.142341] env[59382]: DEBUG oslo_concurrency.lockutils [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 640.142408] env[59382]: DEBUG nova.compute.manager [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] No waiting events found dispatching network-vif-plugged-7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 640.142629] 
env[59382]: WARNING nova.compute.manager [req-a4003acd-b101-4788-85d6-a68ee02a1b88 req-366ecbd9-cb5d-4ffe-9d44-6e2fc32ffe01 service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received unexpected event network-vif-plugged-7b623726-f68f-48cf-b049-c64d5bc7aa64 for instance with vm_state building and task_state spawning. [ 640.645613] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Successfully updated port: 3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 640.658586] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.659380] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquired lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 640.659640] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 640.925656] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 641.500819] env[59382]: DEBUG oslo_concurrency.lockutils [None req-854a5e8f-afd7-458d-9bd2-e041a21ed60c tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Acquiring lock "1fd581d1-9f01-4428-9aac-edc1237e0541" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 641.501201] env[59382]: DEBUG oslo_concurrency.lockutils [None req-854a5e8f-afd7-458d-9bd2-e041a21ed60c tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Lock "1fd581d1-9f01-4428-9aac-edc1237e0541" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 641.896669] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Updating instance_info_cache with network_info: [{"id": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "address": "fa:16:3e:96:47:fc", "network": {"id": "131a5057-cb26-4673-9d37-3a07fc981c97", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-186276812", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b623726-f6", "ovs_interfaceid": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "address": "fa:16:3e:58:3f:e1", "network": {"id": "aca0863f-8feb-4a9f-a438-a0f0ebb8da7c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-494902276", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "392517be-6cb8-4b5b-9a52-449bfe2e16f7", "external-id": "nsx-vlan-transportzone-351", "segmentation_id": 351, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a320330-ca", "ovs_interfaceid": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.918164] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Releasing lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.918578] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance network_info: |[{"id": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "address": "fa:16:3e:96:47:fc", "network": {"id": "131a5057-cb26-4673-9d37-3a07fc981c97", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-186276812", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b623726-f6", "ovs_interfaceid": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "address": "fa:16:3e:58:3f:e1", "network": {"id": "aca0863f-8feb-4a9f-a438-a0f0ebb8da7c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-494902276", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "392517be-6cb8-4b5b-9a52-449bfe2e16f7", "external-id": "nsx-vlan-transportzone-351", "segmentation_id": 351, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a320330-ca", "ovs_interfaceid": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 641.923020] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 
'fa:16:3e:96:47:fc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8e028024-a9c1-4cae-8849-ea770a7ae0e4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7b623726-f68f-48cf-b049-c64d5bc7aa64', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:3f:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '392517be-6cb8-4b5b-9a52-449bfe2e16f7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3a320330-cadf-446a-b3cd-56cdc7a3b9ce', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 641.929463] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Creating folder: Project (01dee33932cb4b16bca7b28b4989593a). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.931100] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-250f6625-c2c3-4030-a297-9020838c17f7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.943770] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Created folder: Project (01dee33932cb4b16bca7b28b4989593a) in parent group-v459741. [ 641.943770] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Creating folder: Instances. Parent ref: group-v459775. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.943770] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d0cb918-4056-4bec-aefa-8db0908dc9da {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.955660] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Created folder: Instances in parent group-v459775. [ 641.955781] env[59382]: DEBUG oslo.service.loopingcall [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 641.955897] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 641.956126] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f4ff3d3e-74b9-4ed0-8c3f-3459c4d6e98e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.979585] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 641.979585] env[59382]: value = "task-2256707" [ 641.979585] env[59382]: _type = "Task" [ 641.979585] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 641.987591] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256707, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 642.490998] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256707, 'name': CreateVM_Task, 'duration_secs': 0.38612} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 642.491348] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 642.492290] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.492631] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 642.493495] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 642.493930] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78d9f4a8-8405-4944-9acc-d68e560ac7ee {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.500287] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Waiting for the task: (returnval){ [ 642.500287] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528a4e82-8507-6450-3b48-93895b1be603" [ 642.500287] env[59382]: _type = "Task" [ 642.500287] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 642.507070] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528a4e82-8507-6450-3b48-93895b1be603, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 643.013366] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 643.013366] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 643.013366] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 643.195890] env[59382]: DEBUG oslo_concurrency.lockutils [None req-806bd8dd-2820-4cb5-8aa5-1db0c03b2433 tempest-ServerActionsTestOtherB-1111562412 tempest-ServerActionsTestOtherB-1111562412-project-member] Acquiring lock "b086c8ed-73fb-4083-872f-f2d90b0e640f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 643.195983] env[59382]: DEBUG oslo_concurrency.lockutils [None req-806bd8dd-2820-4cb5-8aa5-1db0c03b2433 tempest-ServerActionsTestOtherB-1111562412 tempest-ServerActionsTestOtherB-1111562412-project-member] Lock "b086c8ed-73fb-4083-872f-f2d90b0e640f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 643.408428] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-changed-7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 643.408606] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Refreshing instance network info cache due to event network-changed-7b623726-f68f-48cf-b049-c64d5bc7aa64. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 643.408804] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Acquiring lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 643.408967] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Acquired lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 643.409160] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Refreshing network info cache for port 7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 644.333362] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Updated VIF entry in instance network info cache for port 7b623726-f68f-48cf-b049-c64d5bc7aa64. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 644.333872] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Updating instance_info_cache with network_info: [{"id": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "address": "fa:16:3e:96:47:fc", "network": {"id": "131a5057-cb26-4673-9d37-3a07fc981c97", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-186276812", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b623726-f6", "ovs_interfaceid": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "address": "fa:16:3e:58:3f:e1", "network": {"id": "aca0863f-8feb-4a9f-a438-a0f0ebb8da7c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-494902276", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", 
"tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "392517be-6cb8-4b5b-9a52-449bfe2e16f7", "external-id": "nsx-vlan-transportzone-351", "segmentation_id": 351, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a320330-ca", "ovs_interfaceid": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 644.348266] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Releasing lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 644.348514] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-vif-plugged-3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 644.348704] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Acquiring lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 644.348885] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 644.349053] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 644.349216] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] No waiting events found dispatching network-vif-plugged-3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 644.349377] env[59382]: WARNING nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received unexpected event network-vif-plugged-3a320330-cadf-446a-b3cd-56cdc7a3b9ce for instance with vm_state building and task_state spawning. 
[ 644.349526] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-changed-3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 644.349674] env[59382]: DEBUG nova.compute.manager [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Refreshing instance network info cache due to event network-changed-3a320330-cadf-446a-b3cd-56cdc7a3b9ce. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 644.349924] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Acquiring lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 644.350102] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Acquired lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 644.350263] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Refreshing network info cache for port 3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 645.186536] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Updated VIF entry in instance network info cache for port 3a320330-cadf-446a-b3cd-56cdc7a3b9ce. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 645.186997] env[59382]: DEBUG nova.network.neutron [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Updating instance_info_cache with network_info: [{"id": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "address": "fa:16:3e:96:47:fc", "network": {"id": "131a5057-cb26-4673-9d37-3a07fc981c97", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-186276812", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.35", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8e028024-a9c1-4cae-8849-ea770a7ae0e4", "external-id": "nsx-vlan-transportzone-919", "segmentation_id": 919, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b623726-f6", "ovs_interfaceid": "7b623726-f68f-48cf-b049-c64d5bc7aa64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "address": "fa:16:3e:58:3f:e1", "network": {"id": "aca0863f-8feb-4a9f-a438-a0f0ebb8da7c", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-494902276", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.110", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "01dee33932cb4b16bca7b28b4989593a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "392517be-6cb8-4b5b-9a52-449bfe2e16f7", "external-id": "nsx-vlan-transportzone-351", "segmentation_id": 351, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3a320330-ca", "ovs_interfaceid": "3a320330-cadf-446a-b3cd-56cdc7a3b9ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 645.198273] env[59382]: DEBUG oslo_concurrency.lockutils [req-622a3111-44af-4a27-92aa-a02e26073a87 req-9450f890-6c3c-4674-ac91-0c40ef13daac service nova] Releasing lock "refresh_cache-feea4bca-d134-475f-81b9-c8415bacf1f1" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 645.859184] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7346418d-7d73-4322-b02b-b2a0bebe2751 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "4f8eb28d-4e0a-4dbb-a965-f578de1f5f03" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 645.859484] env[59382]: 
DEBUG oslo_concurrency.lockutils [None req-7346418d-7d73-4322-b02b-b2a0bebe2751 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Lock "4f8eb28d-4e0a-4dbb-a965-f578de1f5f03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.280454] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ef8a9254-23fc-4f3f-8927-46ba91da5003 tempest-SecurityGroupsTestJSON-1949994903 tempest-SecurityGroupsTestJSON-1949994903-project-member] Acquiring lock "8e4f0852-9a7b-48d6-ad8d-42df7445798f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.280454] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ef8a9254-23fc-4f3f-8927-46ba91da5003 tempest-SecurityGroupsTestJSON-1949994903 tempest-SecurityGroupsTestJSON-1949994903-project-member] Lock "8e4f0852-9a7b-48d6-ad8d-42df7445798f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 662.530248] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 663.527045] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 663.527349] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.522646] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.526319] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.526482] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 664.526603] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 664.550737] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.550916] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551221] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551396] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551524] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551649] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551997] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.551997] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.552131] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.552209] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 664.552297] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 664.552777] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.552949] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.553093] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 664.553245] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 664.564773] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.565082] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.565210] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.565374] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 664.566592] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c75ede8-aa3a-4e88-826d-28adebaa1e2e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.576375] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a1dd893-3880-4bc7-9217-348c699aea29 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.592046] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8598fe-850d-45f3-b87c-e0d205bd3966 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.599679] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a7ae24f-15b0-4c03-b691-1213238e2052 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.630892] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181249MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 664.631071] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.631318] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.698186] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.698447] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance dee2197a-8c39-4655-be3e-e20fb72f518a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.698581] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4d131062-1c01-4c64-bf26-d38bf9da59d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.698712] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.698819] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.698939] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.699068] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.699186] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.699298] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance c2f5545d-884a-4166-a93b-810ef311c2e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.699411] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance feea4bca-d134-475f-81b9-c8415bacf1f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 664.725453] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 3c235411-c50f-40b5-a681-ca42b7838506 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.749838] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.761026] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance cf672665-36c7-4251-a32a-537b9d4c38ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.772131] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.784874] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6ed2d21c-66d8-4549-b58e-0cbdcb518f48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.795428] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 15f9e508-12f2-42db-b6d1-d9b154b94da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.806543] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 165f09fe-9785-46dd-9016-53fc7838fc14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.818458] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 1fd581d1-9f01-4428-9aac-edc1237e0541 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.829040] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance b086c8ed-73fb-4083-872f-f2d90b0e640f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.837812] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4f8eb28d-4e0a-4dbb-a965-f578de1f5f03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.847499] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8e4f0852-9a7b-48d6-ad8d-42df7445798f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 664.848784] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 664.848784] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 665.097693] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-689a95fb-0259-4b61-b9b8-a7b5ed4f70e1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.106047] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b9aade9-667c-439c-bc35-206649b9ae22 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.136829] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5c1f6f-06ef-4111-9518-e9530f4c762f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.144187] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57c70d75-fdd3-423c-9565-dadf83463062 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.157465] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 665.165436] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 665.181082] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 665.181272] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.155455] env[59382]: DEBUG 
oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 666.824697] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a91678a2-35ff-4c2e-8a35-45dcdcf6391c tempest-ServersTestBootFromVolume-490524462 tempest-ServersTestBootFromVolume-490524462-project-member] Acquiring lock "8012e45c-a015-40c6-b45c-86f9cd5fe806" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.824927] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a91678a2-35ff-4c2e-8a35-45dcdcf6391c tempest-ServersTestBootFromVolume-490524462 tempest-ServersTestBootFromVolume-490524462-project-member] Lock "8012e45c-a015-40c6-b45c-86f9cd5fe806" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.147668] env[59382]: WARNING oslo_vmware.rw_handles [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 680.147668] env[59382]: ERROR oslo_vmware.rw_handles [ 680.148283] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 680.149759] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Caching image {{(pid=59382) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 680.150024] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Copying Virtual Disk [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/cc903673-84d5-4ede-b55b-0a6673dd99a0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 680.150304] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-867c15bd-d4b5-4223-a008-86af02846b72 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.158276] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Waiting for the task: (returnval){ [ 680.158276] env[59382]: value = "task-2256719" [ 680.158276] env[59382]: _type = "Task" [ 680.158276] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.166717] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Task: {'id': task-2256719, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.670028] env[59382]: DEBUG oslo_vmware.exceptions [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 680.670028] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 680.670028] env[59382]: ERROR nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 680.670028] env[59382]: Faults: ['InvalidArgument'] [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Traceback (most recent call last): [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] yield resources [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self.driver.spawn(context, instance, image_meta, [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self._fetch_image_if_missing(context, vi) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] image_cache(vi, tmp_image_ds_loc) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] vm_util.copy_virtual_disk( [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] session._wait_for_task(vmdk_copy_task) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return self.wait_for_task(task_ref) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return evt.wait() [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] result = hub.switch() [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return self.greenlet.switch() [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self.f(*self.args, **self.kw) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] raise exceptions.translate_fault(task_info.error) [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Faults: ['InvalidArgument'] [ 680.670028] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] [ 680.670028] env[59382]: INFO nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Terminating instance [ 680.671297] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 680.671522] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 680.671744] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-09260465-ef20-415e-b92b-e513902ef33f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
680.673851] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 680.674047] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 680.674759] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4661e8c-b680-43af-9037-bd0945483893 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.682675] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 680.682870] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-28c3aef6-07a9-42fe-a4ad-b38b37e47424 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.684931] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 680.685106] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 680.685980] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-47584c09-547a-4704-a316-c8d631b0ab78 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.690380] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 680.690380] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]522b7981-75b5-8c3d-a6fd-169963b84d15" [ 680.690380] env[59382]: _type = "Task" [ 680.690380] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.697143] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]522b7981-75b5-8c3d-a6fd-169963b84d15, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.757413] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 680.757666] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 680.757802] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Deleting the datastore file [datastore1] 4d131062-1c01-4c64-bf26-d38bf9da59d6 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 680.758072] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c87a0e97-f6f5-4d26-ab72-30c694658a86 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.764691] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Waiting for the task: (returnval){ [ 680.764691] env[59382]: value = "task-2256721" [ 680.764691] env[59382]: _type = "Task" [ 680.764691] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.772047] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Task: {'id': task-2256721, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.200773] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 681.201055] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating directory with path [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 681.201278] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-95aa82e9-9985-43f7-9ca2-1f6b988c6401 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.212192] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created directory with path [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 681.212378] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Fetch image to [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 681.212549] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 681.213353] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5caba6d4-b4de-4826-95e7-5396a0f7066e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.219822] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b5d9e2a-fa3e-47ed-a771-60c074e7a8e7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.228835] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c19289b-2e7f-493b-91d4-01dc728b611f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.258599] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba4b0faf-2ff6-4882-bfb5-6dc7dd35825e {{(pid=59382) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.263952] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c00a33b1-7f43-41be-8337-944422bda7f6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.272858] env[59382]: DEBUG oslo_vmware.api [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Task: {'id': task-2256721, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065618} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 681.273104] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 681.273285] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 681.273456] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 681.273626] env[59382]: INFO nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 681.275798] env[59382]: DEBUG nova.compute.claims [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 681.276910] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.276910] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.300561] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 681.349363] env[59382]: DEBUG oslo_vmware.rw_handles [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 681.407597] env[59382]: DEBUG oslo_vmware.rw_handles [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 681.407870] env[59382]: DEBUG oslo_vmware.rw_handles [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 681.609959] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-754ad8af-0a4b-48e2-9c40-d92ccceec147 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.619377] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6ff624-bef3-4306-a918-feca321df9e0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.651133] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f8a00d7-9ff2-4cb0-9e76-e921c607bf7e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.659652] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a16e23c-c24b-4f9f-be37-e84bc99c8b7f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.671476] env[59382]: DEBUG nova.compute.provider_tree [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.680199] env[59382]: DEBUG nova.scheduler.client.report [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.693912] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.418s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.694467] env[59382]: ERROR nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.694467] env[59382]: Faults: ['InvalidArgument'] [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Traceback (most recent call last): [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 681.694467] env[59382]: ERROR 
nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self.driver.spawn(context, instance, image_meta, [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self._fetch_image_if_missing(context, vi) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] image_cache(vi, tmp_image_ds_loc) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] vm_util.copy_virtual_disk( [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] session._wait_for_task(vmdk_copy_task) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return self.wait_for_task(task_ref) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return evt.wait() [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] result = hub.switch() [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] return self.greenlet.switch() [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] self.f(*self.args, **self.kw) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] raise exceptions.translate_fault(task_info.error) [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Faults: ['InvalidArgument'] [ 681.694467] env[59382]: ERROR nova.compute.manager [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] [ 681.695240] env[59382]: DEBUG nova.compute.utils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 681.696720] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Build of instance 4d131062-1c01-4c64-bf26-d38bf9da59d6 was re-scheduled: A specified parameter was not correct: fileType [ 681.696720] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 681.697102] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 681.697276] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 681.697431] env[59382]: DEBUG nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 681.697590] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 682.173366] env[59382]: DEBUG nova.network.neutron [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.187994] env[59382]: INFO nova.compute.manager [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] Took 0.49 seconds to deallocate network for instance. [ 682.279222] env[59382]: INFO nova.scheduler.client.report [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Deleted allocations for instance 4d131062-1c01-4c64-bf26-d38bf9da59d6 [ 682.298615] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1a777ff8-3272-4cbc-a12a-bf36e452a950 tempest-ServerDiagnosticsNegativeTest-1605801056 tempest-ServerDiagnosticsNegativeTest-1605801056-project-member] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 146.712s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.299457] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 144.503s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.299646] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4d131062-1c01-4c64-bf26-d38bf9da59d6] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 682.299816] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "4d131062-1c01-4c64-bf26-d38bf9da59d6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.324579] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 682.383671] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.383926] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.385516] env[59382]: INFO nova.compute.claims [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 682.667292] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a24ee0f-381f-4869-8dbc-5ff4ed088f5a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.674782] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9424f08f-1057-469f-8d5b-0b0a1e6eee2e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.704737] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4dbd259-9a01-44fe-b484-5384883cfcba {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.711878] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef0987ac-511a-4d27-8fc5-db9a0fb4138b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.724552] env[59382]: DEBUG nova.compute.provider_tree [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 682.732824] env[59382]: DEBUG nova.scheduler.client.report [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] 
Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 682.746977] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.363s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.747536] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 682.778222] env[59382]: DEBUG nova.compute.utils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 682.779578] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 682.779753] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 682.789993] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 682.852523] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 682.866583] env[59382]: DEBUG nova.policy [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd29dd4deb5f84db1848dc4ec27a546e4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a096777081a4888a98f65e98b4fa740', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 682.877112] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 682.877390] 
env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 682.877658] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 682.877938] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 682.878226] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 682.878535] env[59382]: DEBUG nova.virt.hardware [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 682.879459] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de5cd9bd-0b9a-4cab-9093-1ba786e30c44 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.887147] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c70655e2-08fe-4e8c-a873-c89e0bf9f6e3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.176667] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Successfully created port: 14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 684.153225] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Successfully updated port: 14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 684.163025] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 684.163408] env[59382]: DEBUG oslo_concurrency.lockutils [None 
req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquired lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 684.163681] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 684.231118] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 684.673140] env[59382]: DEBUG nova.compute.manager [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Received event network-vif-plugged-14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 684.673792] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Acquiring lock "3c235411-c50f-40b5-a681-ca42b7838506-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.675480] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Lock "3c235411-c50f-40b5-a681-ca42b7838506-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.675480] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Lock "3c235411-c50f-40b5-a681-ca42b7838506-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.675480] env[59382]: DEBUG nova.compute.manager [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] No waiting events found dispatching network-vif-plugged-14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 684.675480] env[59382]: WARNING nova.compute.manager [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Received unexpected event network-vif-plugged-14fa78ed-8777-450d-ac25-47967333f524 for instance with vm_state building and task_state spawning. 
[ 684.675480] env[59382]: DEBUG nova.compute.manager [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Received event network-changed-14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 684.675480] env[59382]: DEBUG nova.compute.manager [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Refreshing instance network info cache due to event network-changed-14fa78ed-8777-450d-ac25-47967333f524. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 684.675480] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Acquiring lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 684.700322] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Updating instance_info_cache with network_info: [{"id": "14fa78ed-8777-450d-ac25-47967333f524", "address": "fa:16:3e:c8:f9:65", "network": {"id": "b6f07d04-1d87-467a-8b73-fd536bbd123f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-592586233-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4a096777081a4888a98f65e98b4fa740", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee20e439-fed9-490e-97dd-f3c886977ae1", "external-id": "nsx-vlan-transportzone-357", "segmentation_id": 357, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14fa78ed-87", "ovs_interfaceid": "14fa78ed-8777-450d-ac25-47967333f524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 684.713555] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Releasing lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 684.713690] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance network_info: |[{"id": "14fa78ed-8777-450d-ac25-47967333f524", "address": "fa:16:3e:c8:f9:65", "network": {"id": "b6f07d04-1d87-467a-8b73-fd536bbd123f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-592586233-network", "subnets": [{"cidr": "192.168.128.0/28", 
"dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4a096777081a4888a98f65e98b4fa740", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee20e439-fed9-490e-97dd-f3c886977ae1", "external-id": "nsx-vlan-transportzone-357", "segmentation_id": 357, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14fa78ed-87", "ovs_interfaceid": "14fa78ed-8777-450d-ac25-47967333f524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 684.713960] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Acquired lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 684.714153] env[59382]: DEBUG nova.network.neutron [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Refreshing network info cache for port 14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 684.715220] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:f9:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ee20e439-fed9-490e-97dd-f3c886977ae1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '14fa78ed-8777-450d-ac25-47967333f524', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 684.725262] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Creating folder: Project (4a096777081a4888a98f65e98b4fa740). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 684.726227] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-76affb6d-8bf3-4634-a41d-8a4ac614f8ad {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.739138] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Created folder: Project (4a096777081a4888a98f65e98b4fa740) in parent group-v459741. [ 684.739464] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Creating folder: Instances. Parent ref: group-v459782. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 684.739788] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0db1787e-5144-44ac-b4f5-c4daf2ecba0f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.749961] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Created folder: Instances in parent group-v459782. [ 684.749961] env[59382]: DEBUG oslo.service.loopingcall [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 684.749961] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 684.749961] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38d21366-7d95-42af-930c-559c6b45af9e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.770073] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 684.770073] env[59382]: value = "task-2256724" [ 684.770073] env[59382]: _type = "Task" [ 684.770073] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 684.777288] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256724, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 685.177311] env[59382]: DEBUG nova.network.neutron [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Updated VIF entry in instance network info cache for port 14fa78ed-8777-450d-ac25-47967333f524. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 685.177757] env[59382]: DEBUG nova.network.neutron [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Updating instance_info_cache with network_info: [{"id": "14fa78ed-8777-450d-ac25-47967333f524", "address": "fa:16:3e:c8:f9:65", "network": {"id": "b6f07d04-1d87-467a-8b73-fd536bbd123f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-592586233-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4a096777081a4888a98f65e98b4fa740", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee20e439-fed9-490e-97dd-f3c886977ae1", "external-id": "nsx-vlan-transportzone-357", "segmentation_id": 357, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14fa78ed-87", "ovs_interfaceid": "14fa78ed-8777-450d-ac25-47967333f524", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 685.191584] env[59382]: DEBUG oslo_concurrency.lockutils [req-fa36c99a-3142-4d41-a8d9-6e4834e9b893 req-460da651-2fce-4291-b375-6d3e3ebaf090 service nova] Releasing lock "refresh_cache-3c235411-c50f-40b5-a681-ca42b7838506" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 685.280700] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256724, 'name': CreateVM_Task, 'duration_secs': 0.303946} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 685.280700] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 685.280968] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 685.280968] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 685.281261] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 685.281510] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-57d10f52-920f-4a5d-8592-cf229e38bb15 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.286215] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Waiting for the task: (returnval){ [ 685.286215] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528110a4-8a45-295b-4dce-a13cf6dcc82a" [ 685.286215] env[59382]: _type = "Task" [ 685.286215] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 685.293994] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528110a4-8a45-295b-4dce-a13cf6dcc82a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 685.799656] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 685.800375] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 685.800949] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 723.527078] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 723.527078] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 723.527078] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 724.522524] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 724.526122] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 724.537314] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.537585] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.537771] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 724.537932] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 724.539008] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-669195e7-79eb-4fdb-b233-da44c3be7338 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.547779] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7eee99e-873c-45f1-84a8-d5cd7501123e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.561650] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bdc1457-1d12-4c86-b263-a283d30ec3eb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.568022] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-455b8189-5487-42ef-bdd4-44b42589b9a6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 724.596662] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181217MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 724.596814] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 724.597014] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 724.660133] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660301] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance dee2197a-8c39-4655-be3e-e20fb72f518a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660432] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660607] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660746] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660865] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.660986] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.661116] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance c2f5545d-884a-4166-a93b-810ef311c2e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.661241] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance feea4bca-d134-475f-81b9-c8415bacf1f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.661357] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 3c235411-c50f-40b5-a681-ca42b7838506 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 724.671794] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.683052] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance cf672665-36c7-4251-a32a-537b9d4c38ed has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.692350] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.702568] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6ed2d21c-66d8-4549-b58e-0cbdcb518f48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.714250] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 15f9e508-12f2-42db-b6d1-d9b154b94da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.723381] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 165f09fe-9785-46dd-9016-53fc7838fc14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.732641] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 1fd581d1-9f01-4428-9aac-edc1237e0541 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.741790] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance b086c8ed-73fb-4083-872f-f2d90b0e640f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.752741] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4f8eb28d-4e0a-4dbb-a965-f578de1f5f03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.762769] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8e4f0852-9a7b-48d6-ad8d-42df7445798f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.773170] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8012e45c-a015-40c6-b45c-86f9cd5fe806 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 724.773381] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 724.773523] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 725.015037] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ecd179-7ea6-4e5c-8800-29c1cdd73a24 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.022936] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc84f121-89a4-444f-a524-01eb4263d4d8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.052217] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7a5a97f-4486-453c-9960-f7367cdadae6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.059263] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fc48013-d8aa-4f81-a02c-cf77327157b3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 725.072103] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 725.099122] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 725.112126] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 725.112333] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.515s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 726.108566] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running 
periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.129086] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.129819] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 726.129819] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 726.150834] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151012] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151155] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151286] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151442] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151608] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151745] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.151905] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.152119] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.152286] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 726.152460] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 726.153201] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.153377] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 726.153546] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 727.527367] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 729.317575] env[59382]: WARNING oslo_vmware.rw_handles [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 729.317575] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 729.317575] env[59382]: ERROR 
oslo_vmware.rw_handles [ 729.318242] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 729.319849] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 729.320130] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Copying Virtual Disk [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/9ab53924-0566-48e0-afe0-89d5d2243814/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 729.321686] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a8f1d7c7-a963-41af-8974-73042ee125b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.330954] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 729.330954] env[59382]: value = "task-2256725" [ 729.330954] env[59382]: _type = "Task" [ 729.330954] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 729.339683] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256725, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 729.841609] env[59382]: DEBUG oslo_vmware.exceptions [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 729.841874] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 729.842433] env[59382]: ERROR nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.842433] env[59382]: Faults: ['InvalidArgument'] [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Traceback (most recent call last): [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] yield resources [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self.driver.spawn(context, instance, image_meta, [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self._fetch_image_if_missing(context, vi) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] image_cache(vi, tmp_image_ds_loc) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] vm_util.copy_virtual_disk( [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] session._wait_for_task(vmdk_copy_task) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return self.wait_for_task(task_ref) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return evt.wait() [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] result = hub.switch() [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return self.greenlet.switch() [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self.f(*self.args, **self.kw) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] raise exceptions.translate_fault(task_info.error) [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Faults: ['InvalidArgument'] [ 729.842433] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] [ 729.843449] env[59382]: INFO nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Terminating instance [ 729.844337] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 729.844541] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 729.845169] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Start destroying the instance on the 
hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 729.845356] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 729.845602] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e437cfdd-d5a5-469a-bec1-1d392ed141f6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.848063] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ae1490-daae-431d-b08e-b468ae4ec59a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.854639] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 729.854917] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ea0ca500-0ff3-4348-9319-25b981eaa337 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.857077] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 729.857248] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 729.858159] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fbc4d7cf-ce3e-426e-9286-8041e321fb90 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.862832] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for the task: (returnval){ [ 729.862832] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52268b9d-461c-4c73-5ca5-3398eafedf67" [ 729.862832] env[59382]: _type = "Task" [ 729.862832] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 729.937947] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 729.938355] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 729.938667] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Deleting the datastore file [datastore1] dee2197a-8c39-4655-be3e-e20fb72f518a {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 729.938955] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-30e9e807-ab76-4ae7-ba6a-53b4eda8a07f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 729.945678] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 729.945678] env[59382]: value = "task-2256727" [ 729.945678] env[59382]: _type = "Task" [ 729.945678] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 729.953660] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256727, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 730.372934] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 730.373218] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Creating directory with path [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 730.373479] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-800456f4-6f59-44f0-9497-5ff5f6a1b4e3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.384733] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Created directory with path [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 730.384927] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Fetch image to [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 730.385105] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 730.385908] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-179e1ac2-56ce-4355-b860-02aa75cbcedc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.392564] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e67cfd0-2639-4b81-97fe-691ce4566a4f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.401462] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d3c643e-3802-4fb2-8598-d684e09bf3f3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.432389] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19f7ac02-a98b-4249-93c3-8adc0a205060 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.438039] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3b6b7d5b-2fba-489a-9842-c28e9658c645 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.454196] env[59382]: DEBUG oslo_vmware.api [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256727, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069604} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 730.454449] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 730.454636] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 730.454793] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.454960] env[59382]: INFO nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Took 0.61 seconds to destroy the instance on the hypervisor. 
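The DeleteDatastoreFile_Task records above come from oslo.vmware's task polling (the wait_for_task/_poll_task markers in oslo_vmware/api.py). The following is a minimal Python sketch of that poll loop under stated assumptions, not the oslo.vmware implementation: the get_task_info callable, the TaskInfo-like dict shape, and the 0.5 s interval are illustrative.

import time

class TaskFailed(Exception):
    """Stand-in for the translated VMware fault (illustrative only)."""

def wait_for_task(get_task_info, poll_interval=0.5):
    # Poll until the task leaves queued/running; this mirrors the loop behind
    # the "progress is 0%" and "completed successfully" DEBUG lines above.
    while True:
        info = get_task_info()  # assumed helper returning a TaskInfo-like dict
        if info['state'] in ('queued', 'running'):
            print("Task %s progress is %s%%" % (info['id'], info.get('progress', 0)))
            time.sleep(poll_interval)
        elif info['state'] == 'success':
            return info
        else:  # 'error': oslo.vmware raises exceptions.translate_fault(...) here
            raise TaskFailed(info.get('error'))

# Illustrative use with a canned sequence of states:
states = iter([
    {'id': 'task-2256727', 'state': 'running', 'progress': 0},
    {'id': 'task-2256727', 'state': 'success', 'duration_secs': 0.069604},
])
print(wait_for_task(lambda: next(states)))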
[ 730.457244] env[59382]: DEBUG nova.compute.claims [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 730.457412] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 730.457625] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 730.470179] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 730.526198] env[59382]: DEBUG oslo_vmware.rw_handles [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 730.595470] env[59382]: DEBUG oslo_vmware.rw_handles [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 730.595729] env[59382]: DEBUG oslo_vmware.rw_handles [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 730.824237] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3918384f-be96-4827-b36e-60e39a50ec4b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.831556] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7996087d-9ed5-4de6-8482-fc6754b35726 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.860128] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6be1f706-2434-4fc9-8032-bd340b06df7b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.866689] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0493fafe-fff0-4e96-8abd-ec2266219504 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 730.879448] env[59382]: DEBUG nova.compute.provider_tree [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 730.887838] env[59382]: DEBUG nova.scheduler.client.report [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 730.904127] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.446s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 730.904127] env[59382]: ERROR nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.904127] env[59382]: Faults: ['InvalidArgument'] [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Traceback (most recent call last): [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: 
dee2197a-8c39-4655-be3e-e20fb72f518a] self.driver.spawn(context, instance, image_meta, [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self._fetch_image_if_missing(context, vi) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] image_cache(vi, tmp_image_ds_loc) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] vm_util.copy_virtual_disk( [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] session._wait_for_task(vmdk_copy_task) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return self.wait_for_task(task_ref) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return evt.wait() [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] result = hub.switch() [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] return self.greenlet.switch() [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] self.f(*self.args, **self.kw) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 730.904127] env[59382]: ERROR 
nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] raise exceptions.translate_fault(task_info.error) [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Faults: ['InvalidArgument'] [ 730.904127] env[59382]: ERROR nova.compute.manager [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] [ 730.905083] env[59382]: DEBUG nova.compute.utils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.906477] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Build of instance dee2197a-8c39-4655-be3e-e20fb72f518a was re-scheduled: A specified parameter was not correct: fileType [ 730.906477] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 730.906851] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 730.907038] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 730.907196] env[59382]: DEBUG nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 730.907353] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 731.573051] env[59382]: DEBUG nova.network.neutron [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 731.585398] env[59382]: INFO nova.compute.manager [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] Took 0.68 seconds to deallocate network for instance. [ 731.680587] env[59382]: INFO nova.scheduler.client.report [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Deleted allocations for instance dee2197a-8c39-4655-be3e-e20fb72f518a [ 731.695993] env[59382]: DEBUG oslo_concurrency.lockutils [None req-de4b1a28-ad62-4983-b6a6-4ae91cf6da38 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 195.035s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.697156] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 193.901s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.697342] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: dee2197a-8c39-4655-be3e-e20fb72f518a] During sync_power_state the instance has a pending task (spawning). Skip. 
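The recurring "Acquiring lock ... by ...", "acquired ... waited ..." and '"released" ... held ...' DEBUG lines (inner, lockutils.py:404/409/423) are emitted by the oslo.concurrency synchronized decorator's wrapper. A minimal sketch of that usage, assuming oslo.concurrency is installed (Nova itself routes through its own synchronized helper around the same primitive):

import logging
from oslo_concurrency import lockutils

logging.basicConfig(level=logging.DEBUG)

@lockutils.synchronized('compute_resources')
def abort_instance_claim():
    # Work done while holding the named in-process lock; the decorator's
    # wrapper logs the acquire/wait/release timings seen throughout this log.
    pass

abort_instance_claim()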
[ 731.697511] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "dee2197a-8c39-4655-be3e-e20fb72f518a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 731.718833] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 731.774062] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 731.774162] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 731.775697] env[59382]: INFO nova.compute.claims [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 732.065796] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a14660-9468-415e-9be2-95e0ef7f65ec {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.073264] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a162b06-6f89-43ed-9f4d-eb9ba46e910c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.103539] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0be1fd38-d836-435d-833a-038d676e4d7b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.110679] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7b6b8d-3fe5-44a7-afe8-fd4d1d43a089 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.124939] env[59382]: DEBUG nova.compute.provider_tree [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.133715] env[59382]: DEBUG nova.scheduler.client.report [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 
tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 732.147794] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 732.148307] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 732.181823] env[59382]: DEBUG nova.compute.utils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 732.184237] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 732.184430] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 732.194487] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Start building block device mappings for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 732.243159] env[59382]: DEBUG nova.policy [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '106bc74d926141f984f8ff5a75a03c47', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0e8413e81554aa0b236a72dfdbecc72', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 732.265062] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 732.286257] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 732.286585] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 732.286756] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 732.286942] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 732.287102] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 732.287260] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 732.287469] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 732.287626] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 732.287786] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 732.287945] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 732.288140] env[59382]: DEBUG nova.virt.hardware [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 732.288993] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43011f0-02ee-49f4-b2ec-59beb250826b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.297377] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba449c3-3eaf-47f2-bf01-c31e67290a0e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.728074] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Successfully created port: b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 733.872834] env[59382]: DEBUG nova.compute.manager [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Received event network-vif-plugged-b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 733.872834] env[59382]: DEBUG 
oslo_concurrency.lockutils [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] Acquiring lock "acae2ecc-9a00-4356-96d7-a7521ea46f32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.872834] env[59382]: DEBUG oslo_concurrency.lockutils [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] Lock "acae2ecc-9a00-4356-96d7-a7521ea46f32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.872834] env[59382]: DEBUG oslo_concurrency.lockutils [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] Lock "acae2ecc-9a00-4356-96d7-a7521ea46f32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 733.872834] env[59382]: DEBUG nova.compute.manager [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] No waiting events found dispatching network-vif-plugged-b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 733.872834] env[59382]: WARNING nova.compute.manager [req-90bee844-53d5-4e82-bf92-1f70df1a65b8 req-5050aafa-e15c-4372-b34e-45e3bc8f1873 service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Received unexpected event network-vif-plugged-b2a7649c-f403-42fb-9299-2637927d4fc9 for instance with vm_state building and task_state spawning. 
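The network-vif-plugged handling above follows a waiter/pop pattern: a thread registers interest in an event keyed by instance and event name, and the external-event RPC handler pops the matching waiter, logging the event as unexpected when no waiter exists (as happens here because the instance is still building). A schematic sketch of that pattern; the class layout, method names, and threading.Event usage are illustrative assumptions, not Nova's actual InstanceEvents code.

import threading

class InstanceEvents:
    """Schematic waiter registry for externally delivered instance events."""

    def __init__(self):
        self._waiters = {}  # (instance_uuid, event_key) -> threading.Event
        self._lock = threading.Lock()

    def prepare(self, instance_uuid, event_key):
        # A build thread registers interest before triggering the action
        # (e.g. port binding) that will eventually produce the event.
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_key)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_key):
        # The external-event handler pops a matching waiter; with none
        # registered, the event is logged as unexpected, as above.
        with self._lock:
            waiter = self._waiters.pop((instance_uuid, event_key), None)
        if waiter is None:
            print("Received unexpected event %s for %s" % (event_key, instance_uuid))
        else:
            waiter.set()

events = InstanceEvents()
events.pop_instance_event("acae2ecc", "network-vif-plugged-b2a7649c")  # unexpected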
[ 733.899996] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Successfully updated port: b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 733.907625] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 733.909129] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquired lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.909129] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 733.987734] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 734.458791] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Updating instance_info_cache with network_info: [{"id": "b2a7649c-f403-42fb-9299-2637927d4fc9", "address": "fa:16:3e:7b:af:0f", "network": {"id": "344cc449-44f4-4279-9dc8-42a55736838c", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2110054968-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0e8413e81554aa0b236a72dfdbecc72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a7649c-f4", "ovs_interfaceid": "b2a7649c-f403-42fb-9299-2637927d4fc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.472922] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Releasing lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 734.473255] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance network_info: |[{"id": "b2a7649c-f403-42fb-9299-2637927d4fc9", "address": "fa:16:3e:7b:af:0f", "network": {"id": "344cc449-44f4-4279-9dc8-42a55736838c", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2110054968-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0e8413e81554aa0b236a72dfdbecc72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a7649c-f4", "ovs_interfaceid": "b2a7649c-f403-42fb-9299-2637927d4fc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 734.473660] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:af:0f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '255460d5-71d4-4bfd-87f1-acc10085db7f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2a7649c-f403-42fb-9299-2637927d4fc9', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 734.482403] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Creating folder: Project (d0e8413e81554aa0b236a72dfdbecc72). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 734.483327] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2bf886f-32d6-4904-85e6-c9cb7ccca880 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.495282] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Created folder: Project (d0e8413e81554aa0b236a72dfdbecc72) in parent group-v459741. [ 734.495557] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Creating folder: Instances. Parent ref: group-v459785. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 734.495764] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1cd06926-2ce4-48e8-948b-f7f5937fdcf8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.504922] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Created folder: Instances in parent group-v459785. [ 734.505307] env[59382]: DEBUG oslo.service.loopingcall [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 734.505592] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 734.507508] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-80c94f44-8b6b-4126-ba47-df39025cc186 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.527594] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 734.527594] env[59382]: value = "task-2256730" [ 734.527594] env[59382]: _type = "Task" [ 734.527594] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 734.536663] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256730, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 734.584161] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "03203308-bbd5-4adf-80a3-e851b9341f62" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.584325] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "03203308-bbd5-4adf-80a3-e851b9341f62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 735.039160] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256730, 'name': CreateVM_Task, 'duration_secs': 0.287125} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 735.039478] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 735.039982] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.040169] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.040484] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 735.040732] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-253fc7a6-c85c-45ca-a208-a14d2dae9714 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.045344] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Waiting for the task: (returnval){ [ 735.045344] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e1f4e4-d03e-1d25-8d3a-db5170628490" [ 735.045344] env[59382]: _type = "Task" [ 735.045344] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 735.052737] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e1f4e4-d03e-1d25-8d3a-db5170628490, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 735.168136] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 735.555831] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 735.556207] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 735.556454] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.982431] env[59382]: DEBUG nova.compute.manager [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Received event network-changed-b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 735.982627] env[59382]: DEBUG nova.compute.manager [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Refreshing instance network info cache due to event network-changed-b2a7649c-f403-42fb-9299-2637927d4fc9. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 735.982905] env[59382]: DEBUG oslo_concurrency.lockutils [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] Acquiring lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 735.983077] env[59382]: DEBUG oslo_concurrency.lockutils [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] Acquired lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 735.983246] env[59382]: DEBUG nova.network.neutron [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Refreshing network info cache for port b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 736.278265] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.356381] env[59382]: DEBUG nova.network.neutron [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Updated VIF entry in instance network info cache for port b2a7649c-f403-42fb-9299-2637927d4fc9. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 736.356763] env[59382]: DEBUG nova.network.neutron [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Updating instance_info_cache with network_info: [{"id": "b2a7649c-f403-42fb-9299-2637927d4fc9", "address": "fa:16:3e:7b:af:0f", "network": {"id": "344cc449-44f4-4279-9dc8-42a55736838c", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-2110054968-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d0e8413e81554aa0b236a72dfdbecc72", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2a7649c-f4", "ovs_interfaceid": "b2a7649c-f403-42fb-9299-2637927d4fc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.366245] env[59382]: DEBUG oslo_concurrency.lockutils [req-5f4b6bd2-d262-4dc2-9ee8-5bb6b30c1736 req-0b4220d5-d80c-46cf-8406-b89abdd31b9f service nova] Releasing lock "refresh_cache-acae2ecc-9a00-4356-96d7-a7521ea46f32" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 738.302208] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.973897] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 741.618069] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "d31427c1-9979-4617-b5a1-43aee722d88d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 750.310888] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.030832] env[59382]: WARNING oslo_vmware.rw_handles [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 778.030832] env[59382]: ERROR oslo_vmware.rw_handles [ 778.031579] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 778.033165] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 778.033455] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Copying Virtual Disk [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/179f12a2-4843-443a-86a0-c538acec45a5/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 778.033778] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-853aa13e-eb0f-46a6-8fcd-ba75a11d3862 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.042774] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 
tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for the task: (returnval){ [ 778.042774] env[59382]: value = "task-2256731" [ 778.042774] env[59382]: _type = "Task" [ 778.042774] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 778.051000] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Task: {'id': task-2256731, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 778.553605] env[59382]: DEBUG oslo_vmware.exceptions [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Fault InvalidArgument not matched. {{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 778.553881] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 778.554438] env[59382]: ERROR nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.554438] env[59382]: Faults: ['InvalidArgument'] [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Traceback (most recent call last): [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] yield resources [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self.driver.spawn(context, instance, image_meta, [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self._fetch_image_if_missing(context, vi) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: 
f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] image_cache(vi, tmp_image_ds_loc) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] vm_util.copy_virtual_disk( [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] session._wait_for_task(vmdk_copy_task) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return self.wait_for_task(task_ref) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return evt.wait() [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] result = hub.switch() [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return self.greenlet.switch() [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self.f(*self.args, **self.kw) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] raise exceptions.translate_fault(task_info.error) [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Faults: ['InvalidArgument'] [ 778.554438] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] [ 778.556109] env[59382]: INFO nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Terminating instance [ 778.556398] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 
tempest-ServerExternalEventsTest-345661173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 778.556535] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 778.556770] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78b6633a-0c4c-4627-a47d-3dcc0f7bb7c3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.559023] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 778.559190] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 778.559895] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cffef42-dc80-435a-9b30-cd3c991ef335 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.567185] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 778.568164] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1f5ba43d-9a6d-42e0-b230-6756db335055 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.569614] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 778.569781] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 778.570451] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-94966203-584a-4f8e-b6f5-54882642e18d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.575257] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for the task: (returnval){ [ 778.575257] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5293e4ea-4dcb-130e-a093-01a87912bb29" [ 778.575257] env[59382]: _type = "Task" [ 778.575257] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 778.582157] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5293e4ea-4dcb-130e-a093-01a87912bb29, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 778.647409] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 778.647581] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 778.647722] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Deleting the datastore file [datastore1] f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 778.648019] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f4b061ab-56e0-452a-851e-ba5ffcb9f222 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.654949] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for the task: (returnval){ [ 778.654949] env[59382]: value = "task-2256733" [ 778.654949] env[59382]: _type = "Task" [ 778.654949] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 778.663569] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Task: {'id': task-2256733, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 779.085241] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 779.085563] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Creating directory with path [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 779.085677] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8842428b-3b08-42df-aaaf-30375ee92407 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.096481] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Created directory with path [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 779.096666] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Fetch image to [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 779.096943] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 779.097689] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f0658b9-0572-480a-ac28-388892cd1eef {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.103835] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c6f4377-4182-46d5-affc-449f19772725 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.112537] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40fd645e-38e7-494c-8271-e30af5305641 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.142531] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b47080a-214e-4129-bbb7-1f1ed0f0cc39 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.147616] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4e22522d-b0c7-4a15-8302-56617ca165d0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.162688] env[59382]: DEBUG oslo_vmware.api [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Task: {'id': task-2256733, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074998} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 779.162953] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 779.163193] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 779.163367] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 779.163573] env[59382]: INFO nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 779.165605] env[59382]: DEBUG nova.compute.claims [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 779.165770] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 779.165978] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 779.169777] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 779.218059] env[59382]: DEBUG oslo_vmware.rw_handles [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 779.280807] env[59382]: DEBUG oslo_vmware.rw_handles [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 779.280996] env[59382]: DEBUG oslo_vmware.rw_handles [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 779.489073] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ea8ae37-d590-4aed-a156-c42a05e129ad {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.495390] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7775b04-48d0-401e-8125-90027a61f47b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.528047] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26f9751d-7a52-4c04-9fc9-d9e616c0c952 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.535472] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-302cba55-0f54-46f4-acdb-c93da7e72719 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 779.549090] env[59382]: DEBUG nova.compute.provider_tree [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 779.556755] env[59382]: DEBUG nova.scheduler.client.report [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 779.573773] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.407s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 779.573773] env[59382]: ERROR nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 779.573773] env[59382]: Faults: ['InvalidArgument'] [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Traceback (most recent call last): [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] 
self.driver.spawn(context, instance, image_meta, [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self._fetch_image_if_missing(context, vi) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] image_cache(vi, tmp_image_ds_loc) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] vm_util.copy_virtual_disk( [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] session._wait_for_task(vmdk_copy_task) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return self.wait_for_task(task_ref) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return evt.wait() [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] result = hub.switch() [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] return self.greenlet.switch() [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] self.f(*self.args, **self.kw) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: 
f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] raise exceptions.translate_fault(task_info.error) [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Faults: ['InvalidArgument'] [ 779.573773] env[59382]: ERROR nova.compute.manager [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] [ 779.575076] env[59382]: DEBUG nova.compute.utils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 779.575661] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Build of instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 was re-scheduled: A specified parameter was not correct: fileType [ 779.575661] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 779.576142] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 779.576312] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 779.576479] env[59382]: DEBUG nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 779.576639] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 780.011277] env[59382]: DEBUG nova.network.neutron [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 780.025968] env[59382]: INFO nova.compute.manager [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Took 0.45 seconds to deallocate network for instance. 
[ 780.110818] env[59382]: INFO nova.scheduler.client.report [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Deleted allocations for instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 [ 780.130046] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6da2f0f9-61cb-41f3-95b9-598829055f22 tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 245.131s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 780.131114] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 242.335s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 780.131315] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] During sync_power_state the instance has a pending task (spawning). Skip. [ 780.131487] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 780.131926] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 44.964s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 780.132160] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Acquiring lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 780.132361] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 780.135672] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 780.135672] env[59382]: INFO nova.compute.manager [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Terminating instance [ 780.135940] env[59382]: DEBUG nova.compute.manager [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 780.136348] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 780.136421] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ab7a2ee5-5f1d-4f4a-ae77-e21a40010be9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.146201] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51371364-f348-4b64-87bf-8f3948cddab1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.156993] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 780.176175] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0 could not be found. [ 780.176374] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 780.176546] env[59382]: INFO nova.compute.manager [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Took 0.04 seconds to destroy the instance on the hypervisor. [ 780.176779] env[59382]: DEBUG oslo.service.loopingcall [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 780.177017] env[59382]: DEBUG nova.compute.manager [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 780.177125] env[59382]: DEBUG nova.network.neutron [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 780.200256] env[59382]: DEBUG nova.network.neutron [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 780.207564] env[59382]: INFO nova.compute.manager [-] [instance: f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0] Took 0.03 seconds to deallocate network for instance. [ 780.209346] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 780.209574] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 780.210938] env[59382]: INFO nova.compute.claims [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 780.313550] env[59382]: DEBUG oslo_concurrency.lockutils [None req-6057e04e-8bc6-422f-b9cb-d0045932534a tempest-ImagesNegativeTestJSON-203578987 tempest-ImagesNegativeTestJSON-203578987-project-member] Lock "f84ee7f2-8bdd-4d87-8750-6df9d88dd5a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 780.463425] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4745cd72-091b-4184-9eee-09fbccb1e497 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.470806] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d33abe89-63d8-492c-a3e8-0a42b5354e1f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.499804] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5e686a5-5fcc-44b8-a129-f439af1bf3ab {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.507064] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b44c2d41-87dc-4d05-90fa-345c9486e3fb {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.519604] env[59382]: DEBUG nova.compute.provider_tree [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 780.527622] env[59382]: DEBUG nova.scheduler.client.report [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 780.543127] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 780.543596] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 780.574282] env[59382]: DEBUG nova.compute.utils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 780.576184] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 780.576184] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 780.586153] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Start building block device mappings for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 780.646285] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 780.674486] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 780.674730] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 780.674914] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 780.675140] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 780.675269] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 780.675421] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 780.675615] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 780.675771] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 780.675925] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 780.676095] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 780.676270] env[59382]: DEBUG nova.virt.hardware [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 780.677184] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eafad5a-1a8e-425f-89a2-46071a0c8b95 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.685776] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92704da7-b85d-4e50-9f42-1197897e04f2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.690764] env[59382]: DEBUG nova.policy [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8de534386cf421ba7afa7eeb201c073', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9a558b318b04e2d84893ce991b745cb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 781.425272] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Successfully created port: d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 782.364726] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Successfully updated port: d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 782.369369] env[59382]: DEBUG nova.compute.manager 
[req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Received event network-vif-plugged-d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 782.369702] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] Acquiring lock "cf672665-36c7-4251-a32a-537b9d4c38ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 782.369821] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] Lock "cf672665-36c7-4251-a32a-537b9d4c38ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.369932] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] Lock "cf672665-36c7-4251-a32a-537b9d4c38ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.370147] env[59382]: DEBUG nova.compute.manager [req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] No waiting events found dispatching network-vif-plugged-d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 782.370304] env[59382]: WARNING nova.compute.manager [req-e5b90577-cd12-4d72-aa1c-d1f2145b367e req-05b6855a-e4e4-4313-a03a-9d5055c947d6 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Received unexpected event network-vif-plugged-d00ded62-3944-4584-bf32-4c4f75d397ee for instance with vm_state building and task_state spawning. 
[ 782.371435] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 782.371557] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 782.371692] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 782.426313] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 782.526312] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 782.526490] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Cleaning up deleted instances {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11145}} [ 782.543874] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] There are 0 instances to clean {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11154}} [ 782.544307] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 782.544307] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=59382) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11183}} [ 782.559256] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 782.640913] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Updating instance_info_cache with network_info: [{"id": "d00ded62-3944-4584-bf32-4c4f75d397ee", "address": "fa:16:3e:ed:60:56", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": 
"shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd00ded62-39", "ovs_interfaceid": "d00ded62-3944-4584-bf32-4c4f75d397ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 782.652604] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 782.652876] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance network_info: |[{"id": "d00ded62-3944-4584-bf32-4c4f75d397ee", "address": "fa:16:3e:ed:60:56", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd00ded62-39", "ovs_interfaceid": "d00ded62-3944-4584-bf32-4c4f75d397ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 782.653254] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ed:60:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd00ded62-3944-4584-bf32-4c4f75d397ee', 'vif_model': 'vmxnet3'}] {{(pid=59382) 
build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 782.660874] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating folder: Project (a9a558b318b04e2d84893ce991b745cb). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 782.661660] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b4605997-70a7-42fd-b635-c78be13ae956 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.672905] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created folder: Project (a9a558b318b04e2d84893ce991b745cb) in parent group-v459741. [ 782.673020] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating folder: Instances. Parent ref: group-v459788. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 782.673175] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a531c77d-b947-4cf0-9940-c912cbab0761 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.681593] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created folder: Instances in parent group-v459788. [ 782.681816] env[59382]: DEBUG oslo.service.loopingcall [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 782.681990] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 782.682194] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-781f1b74-fc82-4ebd-9317-83875716b1c3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.702908] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 782.702908] env[59382]: value = "task-2256736" [ 782.702908] env[59382]: _type = "Task" [ 782.702908] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 782.710649] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256736, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 783.212523] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256736, 'name': CreateVM_Task, 'duration_secs': 0.299737} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 783.212701] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 783.213390] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 783.213558] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 783.213876] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 783.214179] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4e39697-19fe-4ca9-9e80-4b5a2cf5430b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.219150] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 783.219150] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52664ee4-ef38-8d19-5b48-dc83c4e89182" [ 783.219150] env[59382]: _type = "Task" [ 783.219150] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 783.226945] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52664ee4-ef38-8d19-5b48-dc83c4e89182, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 783.361948] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 783.362206] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 783.568425] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 783.729306] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 783.729629] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 783.729849] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 784.394034] env[59382]: DEBUG nova.compute.manager [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Received event network-changed-d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 784.394291] env[59382]: DEBUG nova.compute.manager [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Refreshing instance network info cache due to event network-changed-d00ded62-3944-4584-bf32-4c4f75d397ee. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 784.394511] env[59382]: DEBUG oslo_concurrency.lockutils [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] Acquiring lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 784.394654] env[59382]: DEBUG oslo_concurrency.lockutils [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] Acquired lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 784.394810] env[59382]: DEBUG nova.network.neutron [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Refreshing network info cache for port d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 784.526792] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 784.960505] env[59382]: DEBUG nova.network.neutron [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Updated VIF entry in instance network info cache for port d00ded62-3944-4584-bf32-4c4f75d397ee. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 784.960876] env[59382]: DEBUG nova.network.neutron [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Updating instance_info_cache with network_info: [{"id": "d00ded62-3944-4584-bf32-4c4f75d397ee", "address": "fa:16:3e:ed:60:56", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.170", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd00ded62-39", "ovs_interfaceid": "d00ded62-3944-4584-bf32-4c4f75d397ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.970616] env[59382]: DEBUG oslo_concurrency.lockutils [req-2d8cd396-5973-4ae7-98d7-f381695f069c req-e5559ad9-0f38-4985-947d-fe7cfb1bd4d3 service nova] Releasing lock "refresh_cache-cf672665-36c7-4251-a32a-537b9d4c38ed" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
785.522137] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 785.526775] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 786.526804] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 786.527128] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 786.527182] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 786.546287] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.546399] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.546530] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.546655] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.546776] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.546900] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.547032] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.547265] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.547403] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.547527] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 786.547649] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 786.548162] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 786.556980] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 786.557254] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 786.557420] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 786.557588] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 786.558603] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fae4d86-ed63-4a55-b315-506757170fa0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.567412] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d0a9234-1981-4b2a-86f6-9d09850540f0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.581367] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-cfbd6e8c-0bcc-4a74-a4c0-0ae09be3451b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.587448] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f50df2d0-df34-40ae-814c-a70b89706560 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 786.616365] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181238MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 786.616496] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 786.616682] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 786.732202] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.732375] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.732503] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.732625] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.732746] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.732882] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance c2f5545d-884a-4166-a93b-810ef311c2e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.733031] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance feea4bca-d134-475f-81b9-c8415bacf1f1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.733126] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 3c235411-c50f-40b5-a681-ca42b7838506 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.733241] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.733379] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance cf672665-36c7-4251-a32a-537b9d4c38ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 786.744929] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.755480] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6ed2d21c-66d8-4549-b58e-0cbdcb518f48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.764509] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 15f9e508-12f2-42db-b6d1-d9b154b94da3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.773174] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 165f09fe-9785-46dd-9016-53fc7838fc14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.781348] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 1fd581d1-9f01-4428-9aac-edc1237e0541 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.789745] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance b086c8ed-73fb-4083-872f-f2d90b0e640f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.797941] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4f8eb28d-4e0a-4dbb-a965-f578de1f5f03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.805901] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8e4f0852-9a7b-48d6-ad8d-42df7445798f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.814264] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8012e45c-a015-40c6-b45c-86f9cd5fe806 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.823185] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 03203308-bbd5-4adf-80a3-e851b9341f62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.831617] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 786.831840] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 786.831987] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 786.847288] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Refreshing inventories for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 786.861112] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updating ProviderTree inventory for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 786.861290] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 786.870600] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Refreshing aggregate associations for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975, aggregates: None {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 786.886167] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Refreshing trait associations for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975, traits: 
COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 787.129655] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc18b782-5bb6-42b8-9ef4-38506801eca0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.137502] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2592cd24-4918-4c8b-9fe7-751535c47945 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.167719] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ef9869-89f0-4b9a-99ff-7a9fe967a51e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.175098] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f04b51d7-867f-47d5-9751-5115658e38cb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 787.189385] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 787.197605] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 787.210353] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 787.210545] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 788.189597] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 788.189926] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 788.190114] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] 
Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 788.190271] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 822.437821] env[59382]: DEBUG nova.compute.manager [req-2b9c7952-6750-4358-9692-d149f13c55fc req-d488585c-7899-4d46-8e7d-4c8405a2bfe7 service nova] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Received event network-vif-deleted-be25ccbc-e158-419e-a30d-fc08cdc894e1 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 825.120632] env[59382]: DEBUG nova.compute.manager [req-07dedeb6-8d43-4b63-8a4b-0ad0ff327ea6 req-e9b83a0c-90a0-4a10-8e32-ead93bd9eb3a service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-vif-deleted-7b623726-f68f-48cf-b049-c64d5bc7aa64 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 825.204832] env[59382]: WARNING oslo_vmware.rw_handles [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 825.204832] env[59382]: ERROR oslo_vmware.rw_handles [ 825.205294] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 825.207639] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 825.207883] env[59382]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Copying Virtual Disk [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/0022ed94-b0c7-4e8d-94a1-6e1f0a9149c0/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 825.208197] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-42f45047-bcf3-4d89-b942-5c5c0d11d7a4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.216546] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for the task: (returnval){ [ 825.216546] env[59382]: value = "task-2256737" [ 825.216546] env[59382]: _type = "Task" [ 825.216546] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 825.225305] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Task: {'id': task-2256737, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 825.728032] env[59382]: DEBUG oslo_vmware.exceptions [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 825.728297] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 825.728897] env[59382]: ERROR nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 825.728897] env[59382]: Faults: ['InvalidArgument'] [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Traceback (most recent call last): [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] yield resources [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self.driver.spawn(context, instance, image_meta, [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self._fetch_image_if_missing(context, vi) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] image_cache(vi, tmp_image_ds_loc) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] vm_util.copy_virtual_disk( [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] session._wait_for_task(vmdk_copy_task) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return self.wait_for_task(task_ref) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return evt.wait() [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] result = hub.switch() [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return self.greenlet.switch() [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self.f(*self.args, **self.kw) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] raise exceptions.translate_fault(task_info.error) [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Faults: ['InvalidArgument'] [ 825.728897] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] [ 825.730038] env[59382]: INFO nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Terminating instance [ 825.731368] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 825.731584] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 825.732444] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 825.732633] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 825.732862] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9c1a57e-1d5e-4826-8400-e78ed05bbaf8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.735686] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5142df2c-b470-4cd6-8eb9-9bd61ec0f61d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.745115] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 825.746653] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0b0dce19-23e4-4cd1-9233-4a552ef4bbce {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.748364] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 825.748508] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 825.749834] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dbed54b4-86c6-45b8-9f78-8996100a7e80 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.755108] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for the task: (returnval){ [ 825.755108] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52676cd9-870b-87e5-9dd8-23faf190d211" [ 825.755108] env[59382]: _type = "Task" [ 825.755108] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 825.763241] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52676cd9-870b-87e5-9dd8-23faf190d211, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 825.825528] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 825.825900] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 825.825900] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Deleting the datastore file [datastore1] ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 825.826173] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26a57d93-416f-4aec-a510-8a316fbb2f39 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 825.835329] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for the task: (returnval){ [ 825.835329] env[59382]: value = "task-2256739" [ 825.835329] env[59382]: _type = "Task" [ 825.835329] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 825.846291] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Task: {'id': task-2256739, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 826.274716] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 826.275012] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Creating directory with path [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 826.275224] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-904052a6-d071-43e7-a342-a7647b0a3ac3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.292392] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Created directory with path [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 826.292592] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Fetch image to [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 826.292892] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 826.293513] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a9ae37e-3bd8-4032-a500-0741d811f11a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.300576] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-855baa87-abb1-4689-9866-9cca7d4bbef4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.309836] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34c96ffb-1e68-466e-bfb5-4a590ee5e762 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.354845] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4d57f3f-6e73-442c-90c3-f8a51ebd4a88 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} 
[ 826.364386] env[59382]: DEBUG oslo_vmware.api [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Task: {'id': task-2256739, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.088201} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 826.364599] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c1a2eb63-0e61-43ca-b4db-0fb4a36a3086 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.366451] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 826.366567] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 826.370024] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 826.370024] env[59382]: INFO nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 826.371053] env[59382]: DEBUG nova.compute.claims [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 826.371223] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 826.371431] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.387595] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 826.445130] env[59382]: DEBUG oslo_vmware.rw_handles [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 826.507891] env[59382]: DEBUG oslo_vmware.rw_handles [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 826.508073] env[59382]: DEBUG oslo_vmware.rw_handles [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 826.753569] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85eb9a9e-b5e2-4250-aca4-cf6d8db0f740 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.763324] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-643152bc-14c9-4cdf-ba6d-a7e3fcd3e4ee {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.795288] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b24ff19-37c1-411a-9fd8-00edbdfbf9cc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.806019] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5309c6a-5d4f-4e84-bf96-85013ac7b5ba {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.818278] env[59382]: DEBUG nova.compute.provider_tree [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 826.834827] env[59382]: DEBUG nova.scheduler.client.report [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 826.853480] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.482s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 826.854045] env[59382]: ERROR nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 826.854045] env[59382]: Faults: ['InvalidArgument'] [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Traceback (most recent call last): [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: 
ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self.driver.spawn(context, instance, image_meta, [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self._fetch_image_if_missing(context, vi) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] image_cache(vi, tmp_image_ds_loc) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] vm_util.copy_virtual_disk( [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] session._wait_for_task(vmdk_copy_task) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return self.wait_for_task(task_ref) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return evt.wait() [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] result = hub.switch() [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] return self.greenlet.switch() [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] self.f(*self.args, **self.kw) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 826.854045] env[59382]: ERROR 
nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] raise exceptions.translate_fault(task_info.error) [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Faults: ['InvalidArgument'] [ 826.854045] env[59382]: ERROR nova.compute.manager [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] [ 826.855069] env[59382]: DEBUG nova.compute.utils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 826.856511] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Build of instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c was re-scheduled: A specified parameter was not correct: fileType [ 826.856511] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 826.856798] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 826.856965] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 826.857453] env[59382]: DEBUG nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 826.857453] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 827.190968] env[59382]: DEBUG nova.compute.manager [req-c37c22bc-6080-46ce-9ab2-b58de8452023 req-908383b1-26d5-4b08-b2f9-def310de74be service nova] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Received event network-vif-deleted-3a320330-cadf-446a-b3cd-56cdc7a3b9ce {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 827.191220] env[59382]: DEBUG nova.compute.manager [req-c37c22bc-6080-46ce-9ab2-b58de8452023 req-908383b1-26d5-4b08-b2f9-def310de74be service nova] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Received event network-vif-deleted-14fa78ed-8777-450d-ac25-47967333f524 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 827.305590] env[59382]: DEBUG nova.network.neutron [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 827.317023] env[59382]: INFO nova.compute.manager [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Took 0.46 seconds to deallocate network for instance. 
[ 827.441260] env[59382]: INFO nova.scheduler.client.report [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Deleted allocations for instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c [ 827.459914] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ffb9bf72-eb82-414c-a3a9-38c57866939d tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 290.682s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.461053] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 289.664s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.461254] env[59382]: INFO nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] During sync_power_state the instance has a pending task (spawning). Skip. [ 827.461437] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.462287] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 91.184s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.462550] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Acquiring lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.462762] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.462922] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.464658] env[59382]: INFO nova.compute.manager [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Terminating instance [ 827.466370] env[59382]: DEBUG nova.compute.manager [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 827.466563] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 827.466821] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-95443f4e-fd31-4cc4-9d50-0b4a55c7b27b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.476151] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bff9bb1-ea33-4d4a-b268-6d43d08b6a93 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.487469] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 827.508348] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c could not be found. [ 827.508902] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 827.508902] env[59382]: INFO nova.compute.manager [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 827.509112] env[59382]: DEBUG oslo.service.loopingcall [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 827.509215] env[59382]: DEBUG nova.compute.manager [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 827.509426] env[59382]: DEBUG nova.network.neutron [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 827.546959] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 827.546959] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 827.548874] env[59382]: INFO nova.compute.claims [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 827.555263] env[59382]: DEBUG nova.network.neutron [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 827.564590] env[59382]: INFO nova.compute.manager [-] [instance: ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c] Took 0.05 seconds to deallocate network for instance. 
[ 827.714161] env[59382]: DEBUG oslo_concurrency.lockutils [None req-50c46884-4341-424e-801c-d5418269962f tempest-ServerExternalEventsTest-345661173 tempest-ServerExternalEventsTest-345661173-project-member] Lock "ac88dde8-ccf9-48a4-bcb9-1c7fb0670d2c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.251s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.900889] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-857e7524-fd8b-42c0-a596-b2d67e5c6260 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.909426] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ef25e8b-c1ee-4139-8cb7-a2b72c8cdd7f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.942412] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-959824f6-4a6f-4e53-9da3-aab5ce045457 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.951466] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30f8a9ea-5410-4695-ae72-196769841d85 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 827.965779] env[59382]: DEBUG nova.compute.provider_tree [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 827.984133] env[59382]: DEBUG nova.scheduler.client.report [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 828.000028] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.455s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 828.000554] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 828.044533] env[59382]: DEBUG nova.compute.utils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 828.047014] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 828.047014] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 828.063113] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 828.157811] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 828.187652] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 828.188144] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 828.188896] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 828.189199] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 828.189404] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 828.189678] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 828.189953] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 828.190179] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 828.190512] 
env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 828.190727] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 828.190947] env[59382]: DEBUG nova.virt.hardware [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 828.191856] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20d2d69b-e70d-44f1-a495-a00b3d411284 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.202077] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b1741b-ec07-40dc-a7fd-f56f88e51dc1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 828.433188] env[59382]: DEBUG nova.policy [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8de534386cf421ba7afa7eeb201c073', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a9a558b318b04e2d84893ce991b745cb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 829.500306] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Successfully created port: 2f523966-a255-4975-b227-2b680059576d {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 829.678117] env[59382]: DEBUG nova.compute.manager [req-3bfb9b96-0789-41ec-bc88-652e4b7b8f7c req-4c70b671-d907-47e0-80f6-c47a1e041eb7 service nova] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Received event network-vif-deleted-b2a7649c-f403-42fb-9299-2637927d4fc9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 830.707679] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Successfully updated port: 2f523966-a255-4975-b227-2b680059576d {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 830.719703] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 
tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 830.721430] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 830.721430] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 830.761677] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 830.761895] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 830.772852] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 830.772852] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance network_info: |[]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 830.773744] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance VIF info [] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 830.778943] env[59382]: DEBUG oslo.service.loopingcall [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 830.780348] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 830.780348] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6764c8e2-f96b-4ac5-a0ca-993f9ebee8ed {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 830.801805] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 830.801805] env[59382]: value = "task-2256740" [ 830.801805] env[59382]: _type = "Task" [ 830.801805] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 830.812942] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256740, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 831.314027] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256740, 'name': CreateVM_Task, 'duration_secs': 0.291728} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 831.314027] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 831.314027] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 831.314027] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 831.314027] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 831.314027] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3c37e2b-4146-412b-abdd-facd7183df17 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.321064] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 831.321064] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528c61e6-01eb-373e-7c62-18302649d6d1" [ 831.321064] env[59382]: _type = "Task" [ 831.321064] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.330739] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528c61e6-01eb-373e-7c62-18302649d6d1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 831.834623] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 831.834980] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 831.835375] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 832.173880] env[59382]: DEBUG nova.compute.manager [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Received event network-changed-2f523966-a255-4975-b227-2b680059576d {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 832.174078] env[59382]: DEBUG nova.compute.manager [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Refreshing instance network info cache due to event network-changed-2f523966-a255-4975-b227-2b680059576d. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 832.174398] env[59382]: DEBUG oslo_concurrency.lockutils [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] Acquiring lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 832.174488] env[59382]: DEBUG oslo_concurrency.lockutils [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] Acquired lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 832.174714] env[59382]: DEBUG nova.network.neutron [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Refreshing network info cache for port 2f523966-a255-4975-b227-2b680059576d {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 832.280553] env[59382]: DEBUG nova.network.neutron [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 832.453602] env[59382]: DEBUG nova.network.neutron [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance is deleted, no further info cache update {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 832.453806] env[59382]: DEBUG oslo_concurrency.lockutils [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] Releasing lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 832.454075] env[59382]: DEBUG nova.compute.manager [req-6d1f903c-fe40-4082-a50b-1aa4cefd96b2 req-100bdd77-f9f9-47c9-a1fe-b0f0eeec19a5 service nova] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Received event network-vif-deleted-d00ded62-3944-4584-bf32-4c4f75d397ee {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 844.526934] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 844.527378] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 845.522717] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 846.527073] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.522852] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.547513] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.547939] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 847.548283] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 847.565017] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.565017] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 847.565017] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 847.578201] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.578201] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.578201] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 847.578201] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 847.578542] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94664290-e944-4c91-8972-ee6464ee3305 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.588983] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43b70eb9-b674-4d65-ba67-0733e23179a9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.613829] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b82cb9a-ad38-4d63-ad06-71ad7b0dcf5f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.624210] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0683d1e0-ae12-4b8b-82d7-42f8ddf97358 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.657922] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181233MB free_disk=169GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 847.657922] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 847.657922] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 847.721603] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 847.721603] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 847.721603] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 847.721603] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 847.736514] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8012e45c-a015-40c6-b45c-86f9cd5fe806 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 847.747535] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 03203308-bbd5-4adf-80a3-e851b9341f62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 847.764644] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 847.764644] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 847.764644] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 847.864158] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3a08639-1767-4ecc-8147-cc3c1b2de981 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.870941] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15ade895-424c-46a2-85ed-790408a5877c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.903100] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06c53e0b-036d-493d-843f-c482dc297817 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.910627] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48f67752-84d1-4362-ad97-1bd0b63c8bcc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 847.923869] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 169, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 847.956915] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updated inventory for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with generation 32 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 169, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 847.957171] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updating resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 generation from 32 to 33 during operation: update_inventory {{(pid=59382) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 847.957327] env[59382]: DEBUG 
nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 169, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 847.973607] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 847.973801] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.316s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 848.936393] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 849.527821] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 853.053735] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "390366c5-ced3-4ac9-9687-c5d2895fbc1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 853.054078] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Lock "390366c5-ced3-4ac9-9687-c5d2895fbc1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.032361] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "2dfb7e00-fea7-4186-914a-98e1e5fbe49a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.032361] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "2dfb7e00-fea7-4186-914a-98e1e5fbe49a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
:: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.330251] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "81f08c14-ee4b-4954-bf53-dc02bb600279" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.330366] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "81f08c14-ee4b-4954-bf53-dc02bb600279" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.699195] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "05e46e58-1de8-48a0-a139-c202d77e85ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.699419] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "05e46e58-1de8-48a0-a139-c202d77e85ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.188788] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Acquiring lock "55c244ec-daa2-4eef-8de3-324d0815026b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.189092] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "55c244ec-daa2-4eef-8de3-324d0815026b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.786524] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "e57c71dc-fdb2-4861-b716-c6caebd6c29e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.786867] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 
tempest-MultipleCreateTestJSON-519144173-project-member] Lock "e57c71dc-fdb2-4861-b716-c6caebd6c29e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.810813] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "5fd51316-ab8f-4501-8389-de12a294f8da" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.811087] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Lock "5fd51316-ab8f-4501-8389-de12a294f8da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 875.226083] env[59382]: WARNING oslo_vmware.rw_handles [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 875.226083] env[59382]: ERROR oslo_vmware.rw_handles [ 875.226934] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 875.228286] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Caching image {{(pid=59382) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 875.228537] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Copying Virtual Disk [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/a604e696-6b24-4c5c-b220-207fb5a94f1a/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 875.228827] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7dbe4625-9400-44b2-abb8-6ffab9576d64 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.237185] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for the task: (returnval){ [ 875.237185] env[59382]: value = "task-2256751" [ 875.237185] env[59382]: _type = "Task" [ 875.237185] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 875.244641] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': task-2256751, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 875.747665] env[59382]: DEBUG oslo_vmware.exceptions [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 875.747920] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 875.748519] env[59382]: ERROR nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.748519] env[59382]: Faults: ['InvalidArgument'] [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Traceback (most recent call last): [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] yield resources [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self.driver.spawn(context, instance, image_meta, [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self._vmops.spawn(context, instance, image_meta, injected_files, [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self._fetch_image_if_missing(context, vi) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] image_cache(vi, tmp_image_ds_loc) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] vm_util.copy_virtual_disk( [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] session._wait_for_task(vmdk_copy_task) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 875.748519] 
env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return self.wait_for_task(task_ref) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return evt.wait() [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] result = hub.switch() [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return self.greenlet.switch() [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self.f(*self.args, **self.kw) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] raise exceptions.translate_fault(task_info.error) [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Faults: ['InvalidArgument'] [ 875.748519] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] [ 875.749498] env[59382]: INFO nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Terminating instance [ 875.750662] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 875.750788] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 875.751345] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 875.751561] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 875.751786] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-065fb189-5919-4461-92ea-9f5b2c77ed4a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.754429] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2308d35a-fd69-49db-b979-a2626bc69ca7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.761196] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 875.761413] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8290e715-db9f-4a4c-ba81-df3719f6a4cf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.763711] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 875.763885] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 875.764825] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aa0faff5-bb02-4b5a-b8bc-4261e9c24450 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.770897] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for the task: (returnval){ [ 875.770897] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52c1c764-11a9-978b-5dcf-0294680043d5" [ 875.770897] env[59382]: _type = "Task" [ 875.770897] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 875.777760] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52c1c764-11a9-978b-5dcf-0294680043d5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 875.833182] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 875.833398] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 875.833573] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Deleting the datastore file [datastore1] 4075452d-d1ef-4fb7-8fa1-50ef80998151 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 875.833839] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a4f1845b-32e6-4438-8713-768570d8ea4f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 875.841124] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for the task: (returnval){ [ 875.841124] env[59382]: value = "task-2256753" [ 875.841124] env[59382]: _type = "Task" [ 875.841124] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 875.848749] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': task-2256753, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 876.281517] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 876.281837] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Creating directory with path [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 876.281986] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1119059-6aac-41a9-86c7-aea71195dec3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.346609] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Created directory with path [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 876.346809] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Fetch image to [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 876.346974] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 876.348049] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6551b81-ffe4-4c2d-ab64-cd21b9319420 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.353168] env[59382]: DEBUG oslo_vmware.api [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': task-2256753, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075801} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 876.353705] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 876.353911] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 876.354117] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 876.354332] env[59382]: INFO nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Took 0.60 seconds to destroy the instance on the hypervisor. [ 876.358015] env[59382]: DEBUG nova.compute.claims [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 876.358191] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 876.358399] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 876.361267] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f660a080-a4ea-4c4c-b562-e5d67cbd7d7a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.370494] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705f4396-271a-444b-b30a-e4ef65790b48 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.400937] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-619be550-1acf-434e-b850-bad535c1e935 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.409944] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fd7361aa-77d3-4d64-854e-5259d65dfd3b {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.429624] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 876.479861] env[59382]: DEBUG oslo_vmware.rw_handles [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 876.537837] env[59382]: DEBUG oslo_vmware.rw_handles [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 876.539650] env[59382]: DEBUG oslo_vmware.rw_handles [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 876.606728] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a7d4987-300a-42f2-8271-78a4eb0845f5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.615170] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820a5729-e9d0-4402-9e7a-587fb7ca9c31 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.643498] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f521a594-83fb-422a-965e-a1cc9e02771b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.649885] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87ac74f9-9c73-478e-abd5-c489fd86a8ff {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.662385] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.685699] env[59382]: ERROR nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [req-6b75f15b-cff4-48ba-88f0-a860ff08c5d7] Failed to update inventory to [{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}}] for resource provider with UUID 0ed62ac0-b25e-450c-a6ea-1ad3f7977975. 
Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict ", "code": "placement.concurrent_update", "request_id": "req-6b75f15b-cff4-48ba-88f0-a860ff08c5d7"}]}: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 876.701181] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Refreshing inventories for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 876.712671] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating ProviderTree inventory for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 169, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 876.712883] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 169, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.722610] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Refreshing aggregate associations for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975, aggregates: None {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 876.736901] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Refreshing trait associations for resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE {{(pid=59382) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 876.867628] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43a6a57-07e7-4adf-8854-dd952647544c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.875141] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-039e9c0f-d604-4678-a316-86cd2fcc0f6e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.905341] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f029837-026e-4c71-94bf-dc4dd72fa2c5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.912100] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d52223ed-20ca-47a9-ad35-534df191aa57 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 876.926031] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.957627] env[59382]: DEBUG nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updated inventory for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with generation 40 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 876.957958] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating resource provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 generation from 40 to 41 during operation: update_inventory {{(pid=59382) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 876.958200] env[59382]: DEBUG nova.compute.provider_tree [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Updating inventory in ProviderTree for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.970851] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.612s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 876.971518] env[59382]: ERROR nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 876.971518] env[59382]: Faults: ['InvalidArgument'] [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Traceback (most recent call last): [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self.driver.spawn(context, instance, image_meta, [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self._vmops.spawn(context, instance, image_meta, injected_files, [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self._fetch_image_if_missing(context, vi) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] image_cache(vi, tmp_image_ds_loc) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] vm_util.copy_virtual_disk( [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] session._wait_for_task(vmdk_copy_task) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return self.wait_for_task(task_ref) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return evt.wait() [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", 
line 125, in wait [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] result = hub.switch() [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] return self.greenlet.switch() [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] self.f(*self.args, **self.kw) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] raise exceptions.translate_fault(task_info.error) [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Faults: ['InvalidArgument'] [ 876.971518] env[59382]: ERROR nova.compute.manager [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] [ 876.972728] env[59382]: DEBUG nova.compute.utils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 876.973622] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Build of instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 was re-scheduled: A specified parameter was not correct: fileType [ 876.973622] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 876.973990] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 876.974182] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 876.974337] env[59382]: DEBUG nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 876.974501] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.231162] env[59382]: DEBUG nova.network.neutron [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.240776] env[59382]: INFO nova.compute.manager [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Took 0.27 seconds to deallocate network for instance. [ 877.333732] env[59382]: INFO nova.scheduler.client.report [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Deleted allocations for instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 [ 877.362048] env[59382]: DEBUG oslo_concurrency.lockutils [None req-01b8ea8c-8e66-4342-b762-eb81bab34959 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 339.175s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.364128] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 139.061s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 877.364128] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.364128] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 
877.364128] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.365949] env[59382]: INFO nova.compute.manager [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Terminating instance [ 877.368115] env[59382]: DEBUG nova.compute.manager [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 877.368325] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 877.368766] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e05eb350-ff4f-4c05-b05d-df6d5e5bf6d8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.373904] env[59382]: DEBUG nova.compute.manager [None req-dd343cee-b907-404f-9f42-6fa531c5778a tempest-VolumesAssistedSnapshotsTest-800723085 tempest-VolumesAssistedSnapshotsTest-800723085-project-member] [instance: 6ed2d21c-66d8-4549-b58e-0cbdcb518f48] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.380202] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1025355-a25a-48af-98f1-5a523f25017e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 877.398402] env[59382]: DEBUG nova.compute.manager [None req-dd343cee-b907-404f-9f42-6fa531c5778a tempest-VolumesAssistedSnapshotsTest-800723085 tempest-VolumesAssistedSnapshotsTest-800723085-project-member] [instance: 6ed2d21c-66d8-4549-b58e-0cbdcb518f48] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.409020] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4075452d-d1ef-4fb7-8fa1-50ef80998151 could not be found. 
[ 877.409020] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 877.409020] env[59382]: INFO nova.compute.manager [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Took 0.04 seconds to destroy the instance on the hypervisor. [ 877.409020] env[59382]: DEBUG oslo.service.loopingcall [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 877.409274] env[59382]: DEBUG nova.compute.manager [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 877.409371] env[59382]: DEBUG nova.network.neutron [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.425455] env[59382]: DEBUG oslo_concurrency.lockutils [None req-dd343cee-b907-404f-9f42-6fa531c5778a tempest-VolumesAssistedSnapshotsTest-800723085 tempest-VolumesAssistedSnapshotsTest-800723085-project-member] Lock "6ed2d21c-66d8-4549-b58e-0cbdcb518f48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 243.885s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.437492] env[59382]: DEBUG nova.compute.manager [None req-c83e5600-8294-491e-8990-3082774618d8 tempest-ServersTestJSON-2096502770 tempest-ServersTestJSON-2096502770-project-member] [instance: 15f9e508-12f2-42db-b6d1-d9b154b94da3] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.441446] env[59382]: DEBUG nova.network.neutron [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.448636] env[59382]: INFO nova.compute.manager [-] [instance: 4075452d-d1ef-4fb7-8fa1-50ef80998151] Took 0.04 seconds to deallocate network for instance. [ 877.466903] env[59382]: DEBUG nova.compute.manager [None req-c83e5600-8294-491e-8990-3082774618d8 tempest-ServersTestJSON-2096502770 tempest-ServersTestJSON-2096502770-project-member] [instance: 15f9e508-12f2-42db-b6d1-d9b154b94da3] Instance disappeared before build. 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.489041] env[59382]: DEBUG oslo_concurrency.lockutils [None req-c83e5600-8294-491e-8990-3082774618d8 tempest-ServersTestJSON-2096502770 tempest-ServersTestJSON-2096502770-project-member] Lock "15f9e508-12f2-42db-b6d1-d9b154b94da3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.063s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.497270] env[59382]: DEBUG nova.compute.manager [None req-3229f2a1-10f2-4096-ba43-2e7220625ab7 tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] [instance: 165f09fe-9785-46dd-9016-53fc7838fc14] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.521251] env[59382]: DEBUG nova.compute.manager [None req-3229f2a1-10f2-4096-ba43-2e7220625ab7 tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] [instance: 165f09fe-9785-46dd-9016-53fc7838fc14] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.539794] env[59382]: DEBUG oslo_concurrency.lockutils [None req-3229f2a1-10f2-4096-ba43-2e7220625ab7 tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Lock "165f09fe-9785-46dd-9016-53fc7838fc14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.408s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.547161] env[59382]: DEBUG oslo_concurrency.lockutils [None req-d808e7a5-cd1e-47d8-a3de-31c052bd2390 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "4075452d-d1ef-4fb7-8fa1-50ef80998151" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.184s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.548445] env[59382]: DEBUG nova.compute.manager [None req-854a5e8f-afd7-458d-9bd2-e041a21ed60c tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] [instance: 1fd581d1-9f01-4428-9aac-edc1237e0541] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.570137] env[59382]: DEBUG nova.compute.manager [None req-854a5e8f-afd7-458d-9bd2-e041a21ed60c tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] [instance: 1fd581d1-9f01-4428-9aac-edc1237e0541] Instance disappeared before build. 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.590209] env[59382]: DEBUG oslo_concurrency.lockutils [None req-854a5e8f-afd7-458d-9bd2-e041a21ed60c tempest-ServerShowV247Test-2056769075 tempest-ServerShowV247Test-2056769075-project-member] Lock "1fd581d1-9f01-4428-9aac-edc1237e0541" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.089s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.601048] env[59382]: DEBUG nova.compute.manager [None req-806bd8dd-2820-4cb5-8aa5-1db0c03b2433 tempest-ServerActionsTestOtherB-1111562412 tempest-ServerActionsTestOtherB-1111562412-project-member] [instance: b086c8ed-73fb-4083-872f-f2d90b0e640f] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.624911] env[59382]: DEBUG nova.compute.manager [None req-806bd8dd-2820-4cb5-8aa5-1db0c03b2433 tempest-ServerActionsTestOtherB-1111562412 tempest-ServerActionsTestOtherB-1111562412-project-member] [instance: b086c8ed-73fb-4083-872f-f2d90b0e640f] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.644454] env[59382]: DEBUG oslo_concurrency.lockutils [None req-806bd8dd-2820-4cb5-8aa5-1db0c03b2433 tempest-ServerActionsTestOtherB-1111562412 tempest-ServerActionsTestOtherB-1111562412-project-member] Lock "b086c8ed-73fb-4083-872f-f2d90b0e640f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.448s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.652566] env[59382]: DEBUG nova.compute.manager [None req-7346418d-7d73-4322-b02b-b2a0bebe2751 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 4f8eb28d-4e0a-4dbb-a965-f578de1f5f03] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.673361] env[59382]: DEBUG nova.compute.manager [None req-7346418d-7d73-4322-b02b-b2a0bebe2751 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 4f8eb28d-4e0a-4dbb-a965-f578de1f5f03] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.692597] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7346418d-7d73-4322-b02b-b2a0bebe2751 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Lock "4f8eb28d-4e0a-4dbb-a965-f578de1f5f03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.833s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.700642] env[59382]: DEBUG nova.compute.manager [None req-ef8a9254-23fc-4f3f-8927-46ba91da5003 tempest-SecurityGroupsTestJSON-1949994903 tempest-SecurityGroupsTestJSON-1949994903-project-member] [instance: 8e4f0852-9a7b-48d6-ad8d-42df7445798f] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.721813] env[59382]: DEBUG nova.compute.manager [None req-ef8a9254-23fc-4f3f-8927-46ba91da5003 tempest-SecurityGroupsTestJSON-1949994903 tempest-SecurityGroupsTestJSON-1949994903-project-member] [instance: 8e4f0852-9a7b-48d6-ad8d-42df7445798f] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.741077] env[59382]: DEBUG oslo_concurrency.lockutils [None req-ef8a9254-23fc-4f3f-8927-46ba91da5003 tempest-SecurityGroupsTestJSON-1949994903 tempest-SecurityGroupsTestJSON-1949994903-project-member] Lock "8e4f0852-9a7b-48d6-ad8d-42df7445798f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.461s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.748910] env[59382]: DEBUG nova.compute.manager [None req-a91678a2-35ff-4c2e-8a35-45dcdcf6391c tempest-ServersTestBootFromVolume-490524462 tempest-ServersTestBootFromVolume-490524462-project-member] [instance: 8012e45c-a015-40c6-b45c-86f9cd5fe806] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.770435] env[59382]: DEBUG nova.compute.manager [None req-a91678a2-35ff-4c2e-8a35-45dcdcf6391c tempest-ServersTestBootFromVolume-490524462 tempest-ServersTestBootFromVolume-490524462-project-member] [instance: 8012e45c-a015-40c6-b45c-86f9cd5fe806] Instance disappeared before build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 877.789659] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a91678a2-35ff-4c2e-8a35-45dcdcf6391c tempest-ServersTestBootFromVolume-490524462 tempest-ServersTestBootFromVolume-490524462-project-member] Lock "8012e45c-a015-40c6-b45c-86f9cd5fe806" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.965s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 877.797623] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 877.843218] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 877.843526] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 877.845008] env[59382]: INFO nova.compute.claims [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 878.020155] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-849bf1fe-3196-4fad-a356-a2809a9620ed {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.027747] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64523f9d-259c-4a43-9bcc-869291c07a87 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.056566] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85836bc1-249c-4659-aac0-3c132163b3ca {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.063231] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cf7d3a1-8ff3-4a46-8f57-928685725e38 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.075804] env[59382]: DEBUG nova.compute.provider_tree [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 878.085996] env[59382]: DEBUG nova.scheduler.client.report [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 878.102638] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 
tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.103192] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 878.133961] env[59382]: DEBUG nova.compute.utils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 878.135025] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 878.135134] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 878.143191] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 878.209029] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 878.218954] env[59382]: DEBUG nova.policy [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9702a8c287ce4b3a9e48669e01398a12', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '28c98dc82546468791584d1f12a9ae5a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 878.231231] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 878.231490] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 878.231663] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 878.231849] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 878.231994] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 878.232156] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
878.232360] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 878.232516] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 878.232681] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 878.232923] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 878.233114] env[59382]: DEBUG nova.virt.hardware [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 878.233942] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25fd46d8-a997-4433-8d9f-39df0f9c9318 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.241878] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09d37598-45fd-445f-9ea0-9aa00e654c27 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.637166] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Successfully created port: ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 878.646894] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "8ea743ab-33df-4834-b1e7-2ef7f1e1a147" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.646894] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "8ea743ab-33df-4834-b1e7-2ef7f1e1a147" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59382) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.281130] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Successfully updated port: ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 879.290302] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 879.290464] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 879.290580] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 879.318088] env[59382]: DEBUG nova.compute.manager [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Received event network-vif-plugged-ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 879.318088] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Acquiring lock "03203308-bbd5-4adf-80a3-e851b9341f62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.318088] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Lock "03203308-bbd5-4adf-80a3-e851b9341f62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.322147] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Lock "03203308-bbd5-4adf-80a3-e851b9341f62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.322147] env[59382]: DEBUG nova.compute.manager [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] No waiting events found dispatching network-vif-plugged-ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 879.322147] env[59382]: WARNING 
nova.compute.manager [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Received unexpected event network-vif-plugged-ea791d88-34b4-4c82-9b95-77b33bfdd4c3 for instance with vm_state building and task_state spawning. [ 879.322147] env[59382]: DEBUG nova.compute.manager [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Received event network-changed-ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 879.322147] env[59382]: DEBUG nova.compute.manager [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Refreshing instance network info cache due to event network-changed-ea791d88-34b4-4c82-9b95-77b33bfdd4c3. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 879.322147] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Acquiring lock "refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 879.326870] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 879.522472] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Updating instance_info_cache with network_info: [{"id": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "address": "fa:16:3e:aa:69:f8", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea791d88-34", "ovs_interfaceid": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 879.535442] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock 
"refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 879.535754] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance network_info: |[{"id": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "address": "fa:16:3e:aa:69:f8", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea791d88-34", "ovs_interfaceid": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 879.536040] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Acquired lock "refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 879.536222] env[59382]: DEBUG nova.network.neutron [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Refreshing network info cache for port ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 879.537244] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:aa:69:f8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ea791d88-34b4-4c82-9b95-77b33bfdd4c3', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 879.544696] env[59382]: DEBUG oslo.service.loopingcall [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 879.545504] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 879.548058] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e30e949a-de26-43b3-9d09-7eb3b907cf87 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.568684] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 879.568684] env[59382]: value = "task-2256754" [ 879.568684] env[59382]: _type = "Task" [ 879.568684] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 879.576399] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256754, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 880.045610] env[59382]: DEBUG nova.network.neutron [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Updated VIF entry in instance network info cache for port ea791d88-34b4-4c82-9b95-77b33bfdd4c3. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 880.046210] env[59382]: DEBUG nova.network.neutron [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Updating instance_info_cache with network_info: [{"id": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "address": "fa:16:3e:aa:69:f8", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapea791d88-34", "ovs_interfaceid": "ea791d88-34b4-4c82-9b95-77b33bfdd4c3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 880.057305] env[59382]: DEBUG oslo_concurrency.lockutils [req-f2f90cf8-8991-4a1c-b3ff-5680604c14a8 req-d36c6bc6-6450-4a92-8bd9-3a2214692f23 service nova] Releasing lock "refresh_cache-03203308-bbd5-4adf-80a3-e851b9341f62" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 880.078884] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256754, 'name': CreateVM_Task, 'duration_secs': 0.293697} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 880.079068] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 880.079712] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 880.079874] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 880.080214] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 880.080467] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aab4fe75-d6dd-476b-9e14-28e64be4b46e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.084923] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 880.084923] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521a2761-9936-bc55-2c4f-8e021e0c131a" [ 880.084923] env[59382]: _type = "Task" [ 880.084923] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 880.092298] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521a2761-9936-bc55-2c4f-8e021e0c131a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 880.595582] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 880.595828] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 880.596056] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 904.527448] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 906.529071] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 906.529449] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.522703] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.526277] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.526442] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 907.526584] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 907.541753] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 907.542023] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 907.542056] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 907.542165] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 907.542289] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 907.542735] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 907.553733] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.553968] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.554146] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.554296] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 907.555307] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4552a4b7-2d15-4c64-b648-11d2215895c3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.564034] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd147ef8-c253-4cce-80db-aba5b31c7811 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.577903] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-58028a47-6d53-4bfd-a7a7-20030c4315b1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.583918] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3541bb0-5e84-43d7-92a7-bb7a05ee3cfb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.612216] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181246MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 907.612358] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.612541] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.656440] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 907.656592] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 907.656721] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 907.656844] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 03203308-bbd5-4adf-80a3-e851b9341f62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 907.666458] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.676995] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 390366c5-ced3-4ac9-9687-c5d2895fbc1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.685835] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 2dfb7e00-fea7-4186-914a-98e1e5fbe49a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.694420] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 81f08c14-ee4b-4954-bf53-dc02bb600279 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.702732] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 05e46e58-1de8-48a0-a139-c202d77e85ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.711149] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 55c244ec-daa2-4eef-8de3-324d0815026b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.719783] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance e57c71dc-fdb2-4861-b716-c6caebd6c29e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.728105] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 5fd51316-ab8f-4501-8389-de12a294f8da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.736532] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8ea743ab-33df-4834-b1e7-2ef7f1e1a147 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 907.736739] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 907.736885] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 907.883668] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03f487fe-6a4e-490a-a730-09cff4aa8804 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.891626] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af65b1d6-c22b-4bbc-946c-d6ed366d922e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.920559] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a25d1e6-0354-4e8c-afd6-d0c246c388dc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.927480] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c62b4fce-9fba-4ced-8651-e802b4f14755 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.941054] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 907.948853] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 907.963901] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 907.964107] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.352s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 908.948538] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 908.949053] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}}
[ 910.527877] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 911.527710] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 925.982263] env[59382]: WARNING oslo_vmware.rw_handles [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles response.begin()
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 925.982263] env[59382]: ERROR oslo_vmware.rw_handles
[ 925.983030] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 925.984702] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 925.984916] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Copying Virtual Disk [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/5b092d38-74d0-4a42-b2ee-560746f73940/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 925.985227] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2c4a7b41-b9f4-4c1e-a28c-b007a95046ff {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 925.993403] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for the task: (returnval){
[ 925.993403] env[59382]: value = "task-2256755"
[ 925.993403] env[59382]: _type = "Task"
[ 925.993403] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 926.001353] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Task: {'id': task-2256755, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 926.505144] env[59382]: DEBUG oslo_vmware.exceptions [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Fault InvalidArgument not matched. {{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 926.505467] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 926.505951] env[59382]: ERROR nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 926.505951] env[59382]: Faults: ['InvalidArgument']
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Traceback (most recent call last):
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] yield resources
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self.driver.spawn(context, instance, image_meta,
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self._fetch_image_if_missing(context, vi)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] image_cache(vi, tmp_image_ds_loc)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] vm_util.copy_virtual_disk(
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] session._wait_for_task(vmdk_copy_task)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return self.wait_for_task(task_ref)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return evt.wait()
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] result = hub.switch()
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return self.greenlet.switch()
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self.f(*self.args, **self.kw)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] raise exceptions.translate_fault(task_info.error)
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Faults: ['InvalidArgument']
[ 926.505951] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c]
[ 926.506928] env[59382]: INFO nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Terminating instance
[ 926.507833] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 926.508061] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 926.508297] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05615df0-e4de-4200-9a90-1c4b6a7232e6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.510471] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 926.510655] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 926.511359] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b04c6e3-dd70-4262-9ba6-ec90c48131d6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.518246] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 926.519186] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-52c554cc-d0ca-483e-8fe3-df106dc57b48 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.520529] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 926.520756] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 926.521416] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6de108ea-356e-423b-aa79-9d5931f22f03 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.526144] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for the task: (returnval){
[ 926.526144] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]527af682-f5ea-87c0-e641-a0142302c684"
[ 926.526144] env[59382]: _type = "Task"
[ 926.526144] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 926.533858] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]527af682-f5ea-87c0-e641-a0142302c684, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 926.592570] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 926.592781] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 926.592958] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Deleting the datastore file [datastore1] 6feee415-28ca-42b4-bd0a-ea5e531b117c {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 926.593239] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e17b17ff-e70b-4928-96a9-7599430f2621 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 926.599387] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for the task: (returnval){
[ 926.599387] env[59382]: value = "task-2256757"
[ 926.599387] env[59382]: _type = "Task"
[ 926.599387] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 926.606987] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Task: {'id': task-2256757, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 927.036142] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 927.036413] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Creating directory with path [datastore1] vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 927.036646] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6a16ab3-5120-4a2a-b1cd-e03898229f0f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.048600] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Created directory with path [datastore1] vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 927.048792] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Fetch image to [datastore1] vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 927.048956] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 927.049787] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67d5faf8-7b4f-4495-9b82-aea2bdf7056e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.056321] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06b9e919-03de-4636-8b17-67b472a27958 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.065300] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f8299e8-8852-4fec-8284-f9b62594c369 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.095111] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f1fcd07-1e30-4a19-ba17-7417637cbba4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.103412] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a8c1f9ac-170a-4022-9206-e766746aa86d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 927.109497] env[59382]: DEBUG oslo_vmware.api [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Task: {'id': task-2256757, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06315} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 927.109774] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 927.109969] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 927.110158] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 927.110410] env[59382]: INFO nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Took 0.60 seconds to destroy the instance on the hypervisor.
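The failure recorded above and the cleanup that follows it are two halves of one pattern: vCenter rejects CopyVirtualDisk_Task with InvalidArgument on 'fileType', oslo.vmware's task poller translates the task error into a VimFaultException, and Nova's spawn path unwinds by unregistering the VM, deleting its datastore files, and (below) aborting the resource claim so the placement allocation is released. A minimal sketch of that poll-and-translate loop, for orientation only; VimFault, get_task_info, and the 0.5-second interval are illustrative assumptions for this sketch, not the actual oslo.vmware API:

import time

class VimFault(Exception):
    # Stand-in for oslo_vmware.exceptions.VimFaultException (assumed shape).
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list  # e.g. ['InvalidArgument']

def wait_for_task(get_task_info, poll_interval=0.5):
    # Poll a vCenter task until it reaches 'success' or 'error'.
    # get_task_info is any callable returning a TaskInfo-like dict:
    # {'state': ..., 'error': {'msg': ..., 'faults': [...]}}.
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info  # the "completed successfully" lines above
        if info['state'] == 'error':
            # Mirrors "A specified parameter was not correct: fileType"
            raise VimFault(info['error']['msg'], info['error']['faults'])
        time.sleep(poll_interval)  # the "progress is 0%." debug lines

# Example: a task that fails the way task-2256755 did above.
states = iter([
    {'state': 'running'},
    {'state': 'error', 'error': {
        'msg': 'A specified parameter was not correct: fileType',
        'faults': ['InvalidArgument']}},
])
try:
    wait_for_task(lambda: next(states), poll_interval=0)
except VimFault as fault:
    # Nova reacts as logged: destroy the VM, delete datastore files,
    # then abort the compute_resources claim.
    print(fault, fault.fault_list)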
[ 927.112476] env[59382]: DEBUG nova.compute.claims [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 927.112627] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.112835] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.127343] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 927.184086] env[59382]: DEBUG oslo_vmware.rw_handles [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 927.244465] env[59382]: DEBUG oslo_vmware.rw_handles [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 927.244669] env[59382]: DEBUG oslo_vmware.rw_handles [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 927.365579] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-771259f2-fd55-4fde-895a-659f3b67fdc0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.375075] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a6258d4-5774-4cf1-ac4c-bf036e5a5648 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.406737] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b90ef40a-d18e-482a-ab28-b2b8ffdb2b9a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.413783] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3728f058-470e-472b-ba43-79911a492ce8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.426459] env[59382]: DEBUG nova.compute.provider_tree [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 927.435299] env[59382]: DEBUG nova.scheduler.client.report [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 927.448061] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.335s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.448586] env[59382]: ERROR nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.448586] env[59382]: Faults: ['InvalidArgument'] [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Traceback (most recent call last): [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 927.448586] env[59382]: ERROR 
nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self.driver.spawn(context, instance, image_meta, [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self._fetch_image_if_missing(context, vi) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] image_cache(vi, tmp_image_ds_loc) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] vm_util.copy_virtual_disk( [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] session._wait_for_task(vmdk_copy_task) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return self.wait_for_task(task_ref) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return evt.wait() [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] result = hub.switch() [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] return self.greenlet.switch() [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] self.f(*self.args, **self.kw) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] raise exceptions.translate_fault(task_info.error) [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Faults: ['InvalidArgument'] [ 927.448586] env[59382]: ERROR nova.compute.manager [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] [ 927.449330] env[59382]: DEBUG nova.compute.utils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 927.450655] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Build of instance 6feee415-28ca-42b4-bd0a-ea5e531b117c was re-scheduled: A specified parameter was not correct: fileType [ 927.450655] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 927.451052] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 927.451227] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 927.451395] env[59382]: DEBUG nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 927.451556] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 927.725146] env[59382]: DEBUG nova.network.neutron [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 927.734788] env[59382]: INFO nova.compute.manager [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Took 0.28 seconds to deallocate network for instance. [ 927.820937] env[59382]: INFO nova.scheduler.client.report [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Deleted allocations for instance 6feee415-28ca-42b4-bd0a-ea5e531b117c [ 927.836671] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b2de696a-9c2c-4585-81e1-ca61f6137c18 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 386.371s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.837748] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 186.864s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.837977] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Acquiring lock "6feee415-28ca-42b4-bd0a-ea5e531b117c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.839316] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock 
"6feee415-28ca-42b4-bd0a-ea5e531b117c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.839316] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 927.840614] env[59382]: INFO nova.compute.manager [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Terminating instance [ 927.842251] env[59382]: DEBUG nova.compute.manager [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 927.842471] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 927.842900] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a90262bd-1375-4ab3-9f0a-48c1e8ce68b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.852746] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fedc38f-9eb3-400c-ac77-cbb1eab6a3c8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.865592] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 927.885645] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6feee415-28ca-42b4-bd0a-ea5e531b117c could not be found. 
[ 927.885862] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 927.886065] env[59382]: INFO nova.compute.manager [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 927.886310] env[59382]: DEBUG oslo.service.loopingcall [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 927.886538] env[59382]: DEBUG nova.compute.manager [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 927.886636] env[59382]: DEBUG nova.network.neutron [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 927.915891] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 927.916149] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 927.917956] env[59382]: INFO nova.compute.claims [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 927.922241] env[59382]: DEBUG nova.network.neutron [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 927.928783] env[59382]: INFO nova.compute.manager [-] [instance: 6feee415-28ca-42b4-bd0a-ea5e531b117c] Took 0.04 seconds to deallocate network for instance. 
[ 928.017120] env[59382]: DEBUG oslo_concurrency.lockutils [None req-053da17d-a0f1-4b72-afee-c194be5d0237 tempest-ImagesOneServerNegativeTestJSON-995154376 tempest-ImagesOneServerNegativeTestJSON-995154376-project-member] Lock "6feee415-28ca-42b4-bd0a-ea5e531b117c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 928.105076] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e2d72f-872b-4c3d-a1af-567287682ab7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.112603] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-357aa85a-3cd0-47b3-803c-8058dc0bedec {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.143649] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ac42e9c-305d-484f-9027-608c0218d53c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.151474] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1548b6a8-703b-449d-8c92-32c444223559 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.164953] env[59382]: DEBUG nova.compute.provider_tree [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 928.173467] env[59382]: DEBUG nova.scheduler.client.report [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 928.186099] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 928.186568] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 928.219402] env[59382]: DEBUG nova.compute.utils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 928.220803] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Not allocating networking since 'none' was specified. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 928.230661] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 928.287858] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 928.310409] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 928.310674] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 928.310797] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 928.310977] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 928.311292] env[59382]: DEBUG nova.virt.hardware [None 
req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 928.311381] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 928.311554] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 928.311713] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 928.311875] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 928.312063] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 928.312258] env[59382]: DEBUG nova.virt.hardware [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 928.313213] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf932b6-4e34-465e-bc49-8a1741ab7c7e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.321016] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa86860c-9efd-4521-bceb-674e1f95b010 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.334577] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance VIF info [] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 928.340031] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Creating folder: Project (02d81ab9b2594d5fa698d04f9e983263). Parent ref: group-v459741. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 928.340294] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7b04a187-e03f-47ee-b269-cb1956cc6fd1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.349244] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Created folder: Project (02d81ab9b2594d5fa698d04f9e983263) in parent group-v459741. [ 928.349424] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Creating folder: Instances. Parent ref: group-v459797. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 928.349665] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-25371044-574f-4429-8067-abc329e6c903 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.358873] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Created folder: Instances in parent group-v459797. [ 928.359108] env[59382]: DEBUG oslo.service.loopingcall [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 928.359295] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 928.359501] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-39d85f73-510a-4085-a55c-f79f48d39eb6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.375015] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 928.375015] env[59382]: value = "task-2256760" [ 928.375015] env[59382]: _type = "Task" [ 928.375015] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 928.382077] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256760, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 928.885659] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256760, 'name': CreateVM_Task, 'duration_secs': 0.234847} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 928.885822] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 928.886524] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 928.886692] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 928.887014] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 928.887265] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78e91c0a-0756-4863-a40f-2ccc435ae645 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.892943] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for the task: (returnval){ [ 928.892943] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5243f74d-a787-8951-3c22-05aa2c0ed5ca" [ 928.892943] env[59382]: _type = "Task" [ 928.892943] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 928.899212] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5243f74d-a787-8951-3c22-05aa2c0ed5ca, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 929.401779] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 929.402050] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 929.402261] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 931.705978] env[59382]: DEBUG nova.compute.manager [req-1dabe0de-35b6-48be-9027-3611c5a2c607 req-1dd091e8-21fb-4dde-bc02-ea7483038123 service nova] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Received event network-vif-deleted-ea791d88-34b4-4c82-9b95-77b33bfdd4c3 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 964.528628] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 966.526910] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 967.527774] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 968.522507] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 968.526202] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 968.526401] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 968.526533] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9862}} [ 968.542412] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 968.542740] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 968.542740] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 968.542837] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 969.526676] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 969.526838] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 969.527047] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 969.537231] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 969.537426] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 969.537591] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 969.537737] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 969.538820] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-eaf35424-0840-4c2e-9063-afe596bcf5eb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.547316] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf4f6778-03f9-418a-a996-aa5b159575e3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.560765] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6416e72c-61bc-4836-bca8-173a97989ec0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.566734] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b571542-a5e6-4771-9488-4de71ac463a1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.595091] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181242MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 969.595245] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 969.595428] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 969.638215] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance d31427c1-9979-4617-b5a1-43aee722d88d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 969.638384] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 969.638511] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 969.648409] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 390366c5-ced3-4ac9-9687-c5d2895fbc1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.658937] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 2dfb7e00-fea7-4186-914a-98e1e5fbe49a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.667960] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 81f08c14-ee4b-4954-bf53-dc02bb600279 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.677729] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 05e46e58-1de8-48a0-a139-c202d77e85ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.686516] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 55c244ec-daa2-4eef-8de3-324d0815026b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.695755] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance e57c71dc-fdb2-4861-b716-c6caebd6c29e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.704446] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 5fd51316-ab8f-4501-8389-de12a294f8da has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.712999] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8ea743ab-33df-4834-b1e7-2ef7f1e1a147 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 969.713222] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 969.713368] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 969.838877] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-743ad70a-ddf3-4dc4-8097-28cd3772c790 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.846547] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77435e04-45c4-4e7f-9e3d-93569ed08e1f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.875990] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a976f7c6-cc21-44c7-9c07-11dfd080b0f5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.882914] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65403363-3397-4192-8627-b1e0fe931097 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.895582] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 969.904545] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 969.919428] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 969.919598] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.919777] env[59382]: DEBUG 
oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 972.522232] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 972.537217] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 973.102740] env[59382]: WARNING oslo_vmware.rw_handles [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 973.102740] env[59382]: ERROR oslo_vmware.rw_handles [ 973.103464] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 973.104993] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 973.105258] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Copying Virtual Disk [datastore1] vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] 
vmware_temp/34969b7a-4744-4e36-b8c5-75d42764a8bf/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 973.105510] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-76f5f298-44b6-4bf4-b4d3-15e2a7eb69be {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.113209] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for the task: (returnval){ [ 973.113209] env[59382]: value = "task-2256761" [ 973.113209] env[59382]: _type = "Task" [ 973.113209] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 973.121218] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Task: {'id': task-2256761, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 973.623760] env[59382]: DEBUG oslo_vmware.exceptions [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Fault InvalidArgument not matched. {{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 973.624021] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 973.624616] env[59382]: ERROR nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 973.624616] env[59382]: Faults: ['InvalidArgument'] [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Traceback (most recent call last): [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] yield resources [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self.driver.spawn(context, instance, image_meta, [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self._vmops.spawn(context, instance, image_meta, 
injected_files, [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self._fetch_image_if_missing(context, vi) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] image_cache(vi, tmp_image_ds_loc) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] vm_util.copy_virtual_disk( [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] session._wait_for_task(vmdk_copy_task) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return self.wait_for_task(task_ref) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return evt.wait() [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] result = hub.switch() [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return self.greenlet.switch() [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self.f(*self.args, **self.kw) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] raise exceptions.translate_fault(task_info.error) [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] 
Faults: ['InvalidArgument'] [ 973.624616] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] [ 973.625675] env[59382]: INFO nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Terminating instance [ 973.626551] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 973.626763] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 973.627261] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 973.627420] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 973.627571] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 973.628449] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3b45b648-8ed3-4f58-a432-1ccbfe117d1d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.637101] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 973.637280] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 973.638499] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e5cfa21-477a-41b5-aaa0-008dc0edc613 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.647063] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for the task: (returnval){ [ 973.647063] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5238a86e-a9d0-ce5e-a0be-681e1e4b5ebb" [ 973.647063] env[59382]: _type = "Task" [ 973.647063] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 973.653708] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5238a86e-a9d0-ce5e-a0be-681e1e4b5ebb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 973.659139] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 973.720172] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 973.728646] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Releasing lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 973.729117] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 973.729359] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 973.730381] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-026d3e25-993a-43a9-923e-f0492c72b6d6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.737987] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 973.738211] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ca351b23-7e4a-4ac8-9d79-2ad9499be5c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.766531] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 973.766741] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 973.766920] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Deleting the datastore file [datastore1] 93e239f1-44f6-4dfa-8634-50b933aaf9bd {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 973.767156] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2621da60-becf-4b22-8aa8-c0ea49806fdb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.773488] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for the task: (returnval){ [ 973.773488] env[59382]: value = "task-2256763" [ 973.773488] env[59382]: _type = "Task" [ 973.773488] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 973.780722] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Task: {'id': task-2256763, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 974.156078] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 974.156380] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Creating directory with path [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 974.156544] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52c367ce-6bbe-44cd-9a9d-6bfaba371cd7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.168525] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Created directory with path [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 974.168713] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Fetch image to [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 974.168880] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 974.169572] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11e6a01f-15c6-44e3-8632-c9a71777bbbd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.176184] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46f4f8e4-e9a7-4df5-94fc-16d6a5ce46f0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.185455] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbaf0af2-cb3b-40cb-94d8-4a60a4fc146a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.216395] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1a40298-c5cf-4d88-90c7-ff8bc0adba62 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.221881] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6ccbacb1-5d46-4456-83a0-b47f1ade19a2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.247195] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 974.283069] env[59382]: DEBUG oslo_vmware.api [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Task: {'id': task-2256763, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.036615} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 974.283330] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 974.283330] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 974.283491] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 974.283660] env[59382]: INFO nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Took 0.55 seconds to destroy the instance on the hypervisor. [ 974.283889] env[59382]: DEBUG oslo.service.loopingcall [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 974.284102] env[59382]: DEBUG nova.compute.manager [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 974.286427] env[59382]: DEBUG nova.compute.claims [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 974.286627] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 974.286835] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 974.297342] env[59382]: DEBUG oslo_vmware.rw_handles [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 974.355368] env[59382]: DEBUG oslo_vmware.rw_handles [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 974.355368] env[59382]: DEBUG oslo_vmware.rw_handles [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 974.475705] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56ffaca2-4f60-474e-9a9f-554fac371895 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.483083] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07fdac65-7b24-4139-8be0-0970525bed4a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.511699] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-871a63bf-ada1-412c-b602-dff5f093c607 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.518285] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c41c9a-054d-4019-a80a-4edb789c9505 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.531496] env[59382]: DEBUG nova.compute.provider_tree [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 974.539504] env[59382]: DEBUG nova.scheduler.client.report [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 974.551579] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.265s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 974.552139] env[59382]: ERROR nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 974.552139] env[59382]: Faults: ['InvalidArgument'] [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Traceback (most recent call last): [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] 
self.driver.spawn(context, instance, image_meta, [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self._fetch_image_if_missing(context, vi) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] image_cache(vi, tmp_image_ds_loc) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] vm_util.copy_virtual_disk( [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] session._wait_for_task(vmdk_copy_task) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return self.wait_for_task(task_ref) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return evt.wait() [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] result = hub.switch() [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] return self.greenlet.switch() [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] self.f(*self.args, **self.kw) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 
93e239f1-44f6-4dfa-8634-50b933aaf9bd] raise exceptions.translate_fault(task_info.error) [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Faults: ['InvalidArgument'] [ 974.552139] env[59382]: ERROR nova.compute.manager [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] [ 974.552985] env[59382]: DEBUG nova.compute.utils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 974.554059] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Build of instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd was re-scheduled: A specified parameter was not correct: fileType [ 974.554059] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 974.554474] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 974.554690] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 974.554839] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 974.555036] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 974.578335] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 974.634838] env[59382]: DEBUG nova.network.neutron [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 974.642830] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Releasing lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 974.643039] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 974.643220] env[59382]: DEBUG nova.compute.manager [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Skipping network deallocation for instance since networking was not requested. {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 974.720426] env[59382]: INFO nova.scheduler.client.report [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Deleted allocations for instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd [ 974.735082] env[59382]: DEBUG oslo_concurrency.lockutils [None req-62a0a4e3-f7d4-4a1e-ae56-a3886407f1ea tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.725s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 974.736095] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 224.425s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 974.736310] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 974.736583] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 974.736706] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 974.738461] env[59382]: INFO nova.compute.manager [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Terminating instance [ 974.739906] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquiring lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 974.740063] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Acquired lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 974.740228] env[59382]: DEBUG nova.network.neutron [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 974.749480] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 974.762103] env[59382]: DEBUG nova.network.neutron [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 974.794458] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 974.794739] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 974.796254] env[59382]: INFO nova.compute.claims [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 974.819839] env[59382]: DEBUG nova.network.neutron [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 974.827930] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Releasing lock "refresh_cache-93e239f1-44f6-4dfa-8634-50b933aaf9bd" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 974.828326] env[59382]: DEBUG nova.compute.manager [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 974.828519] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 974.829223] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-14df293f-5019-4a9f-b62d-be969a02b0e2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.839668] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6c480ba-5d64-4c57-9179-9ad2e5a13afd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.869622] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93e239f1-44f6-4dfa-8634-50b933aaf9bd could not be found. 
[ 974.869835] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 974.870018] env[59382]: INFO nova.compute.manager [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 974.870255] env[59382]: DEBUG oslo.service.loopingcall [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 974.872697] env[59382]: DEBUG nova.compute.manager [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 974.872787] env[59382]: DEBUG nova.network.neutron [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 974.888899] env[59382]: DEBUG nova.network.neutron [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 974.898184] env[59382]: DEBUG nova.network.neutron [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 974.905744] env[59382]: INFO nova.compute.manager [-] [instance: 93e239f1-44f6-4dfa-8634-50b933aaf9bd] Took 0.03 seconds to deallocate network for instance. 
[ 974.988399] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27d27550-62d7-422e-b1a5-e6ace582de5f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.994339] env[59382]: DEBUG oslo_concurrency.lockutils [None req-f6f06f50-f04b-446f-a0e6-72c131b6ecd1 tempest-ServersAdmin275Test-1799673378 tempest-ServersAdmin275Test-1799673378-project-member] Lock "93e239f1-44f6-4dfa-8634-50b933aaf9bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.258s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 974.998680] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1080f69-f01e-4915-ad62-17cbe9dd0223 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.029507] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb52fc8-9c6c-41fb-81e1-d062357945f7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.037033] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82cab190-562e-48ab-9e1d-8cb3bf49f49c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.050456] env[59382]: DEBUG nova.compute.provider_tree [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 975.058640] env[59382]: DEBUG nova.scheduler.client.report [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 975.070516] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 975.070953] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 975.099739] env[59382]: DEBUG nova.compute.utils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 975.100883] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 975.101064] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 975.108653] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 975.165857] env[59382]: DEBUG nova.policy [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '88e9b08d9b0f4ea6af67c2e9a311cadb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b372c2f1cc854466849dc8615b25132e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 975.168868] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 975.188802] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 975.189043] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 975.189205] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 975.189386] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 975.189531] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 975.189683] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 975.189888] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 975.190056] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 975.190223] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] 
Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 975.190381] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 975.190551] env[59382]: DEBUG nova.virt.hardware [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 975.191434] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f2719ab-1d03-4475-8b45-f6aaf6ec63a4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.199037] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef9f887f-b668-4e5d-a0a5-c5aba76797ae {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 975.441309] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Successfully created port: 2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 975.879973] env[59382]: DEBUG nova.compute.manager [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Received event network-vif-plugged-2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 975.880217] env[59382]: DEBUG oslo_concurrency.lockutils [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] Acquiring lock "390366c5-ced3-4ac9-9687-c5d2895fbc1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 975.880422] env[59382]: DEBUG oslo_concurrency.lockutils [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] Lock "390366c5-ced3-4ac9-9687-c5d2895fbc1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 975.880588] env[59382]: DEBUG oslo_concurrency.lockutils [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] Lock "390366c5-ced3-4ac9-9687-c5d2895fbc1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 975.880777] env[59382]: DEBUG nova.compute.manager [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] No waiting events found dispatching network-vif-plugged-2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) 
pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 975.880904] env[59382]: WARNING nova.compute.manager [req-9b9b78f5-61a0-42a0-81ce-c1fd93afa61c req-d6175cf3-adbf-4da1-8e7c-e8c583fbecbe service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Received unexpected event network-vif-plugged-2265de5a-b9e6-429e-80ef-67f11ac0e930 for instance with vm_state building and task_state spawning. [ 975.958027] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Successfully updated port: 2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 975.971952] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 975.972050] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquired lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 975.972149] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 976.026136] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 976.262720] env[59382]: DEBUG nova.network.neutron [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Updating instance_info_cache with network_info: [{"id": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "address": "fa:16:3e:78:db:55", "network": {"id": "4e2d07e6-9d63-4800-a478-29fc9757a7b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-701284724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b372c2f1cc854466849dc8615b25132e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2265de5a-b9", "ovs_interfaceid": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 976.272940] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Releasing lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 976.273240] env[59382]: DEBUG nova.compute.manager [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Instance network_info: |[{"id": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "address": "fa:16:3e:78:db:55", "network": {"id": "4e2d07e6-9d63-4800-a478-29fc9757a7b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-701284724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b372c2f1cc854466849dc8615b25132e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2265de5a-b9", "ovs_interfaceid": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 976.273603] env[59382]: DEBUG 
nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:db:55', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'da0e5087-d65b-416f-90fe-beaa9c534ad3', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2265de5a-b9e6-429e-80ef-67f11ac0e930', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 976.281199] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Creating folder: Project (b372c2f1cc854466849dc8615b25132e). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 976.281672] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e5f61790-1fab-49f8-86d9-b8ea496be249 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.292994] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Created folder: Project (b372c2f1cc854466849dc8615b25132e) in parent group-v459741. [ 976.293181] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Creating folder: Instances. Parent ref: group-v459800. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 976.293397] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e21c6e8f-c9a6-4949-a653-da3177aae47b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.301985] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Created folder: Instances in parent group-v459800. [ 976.302211] env[59382]: DEBUG oslo.service.loopingcall [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 976.302449] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 976.302634] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a56300f0-226e-4e60-b7fc-844d8383f79e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.320372] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 976.320372] env[59382]: value = "task-2256766" [ 976.320372] env[59382]: _type = "Task" [ 976.320372] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.327278] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256766, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.829197] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256766, 'name': CreateVM_Task, 'duration_secs': 0.306634} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 976.829357] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 976.830102] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 976.830272] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 976.830584] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 976.830808] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b811c66e-9949-412c-b466-451a9d90a046 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.834912] env[59382]: DEBUG oslo_vmware.api [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Waiting for the task: (returnval){ [ 976.834912] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52d3c5b2-b391-4b1c-ee2c-a3a4748c54c4" [ 976.834912] env[59382]: _type = "Task" [ 976.834912] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.848499] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 976.848712] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 976.848907] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 977.905032] env[59382]: DEBUG nova.compute.manager [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Received event network-changed-2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 977.905331] env[59382]: DEBUG nova.compute.manager [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Refreshing instance network info cache due to event network-changed-2265de5a-b9e6-429e-80ef-67f11ac0e930. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 977.905452] env[59382]: DEBUG oslo_concurrency.lockutils [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] Acquiring lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 977.905594] env[59382]: DEBUG oslo_concurrency.lockutils [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] Acquired lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 977.905752] env[59382]: DEBUG nova.network.neutron [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Refreshing network info cache for port 2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 978.155014] env[59382]: DEBUG nova.network.neutron [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Updated VIF entry in instance network info cache for port 2265de5a-b9e6-429e-80ef-67f11ac0e930. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 978.155443] env[59382]: DEBUG nova.network.neutron [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Updating instance_info_cache with network_info: [{"id": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "address": "fa:16:3e:78:db:55", "network": {"id": "4e2d07e6-9d63-4800-a478-29fc9757a7b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-701284724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b372c2f1cc854466849dc8615b25132e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "da0e5087-d65b-416f-90fe-beaa9c534ad3", "external-id": "nsx-vlan-transportzone-522", "segmentation_id": 522, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2265de5a-b9", "ovs_interfaceid": "2265de5a-b9e6-429e-80ef-67f11ac0e930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 978.166183] env[59382]: DEBUG oslo_concurrency.lockutils [req-89879567-b50c-4b3c-97e9-b71a1d28fb3a req-160cebe6-4cc8-4081-afed-3259c9bddb90 service nova] Releasing lock "refresh_cache-390366c5-ced3-4ac9-9687-c5d2895fbc1a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 979.471983] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1019.422198] env[59382]: WARNING oslo_vmware.rw_handles [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 
1019.422198] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1019.422198] env[59382]: ERROR oslo_vmware.rw_handles [ 1019.422717] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1019.424394] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1019.424636] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Copying Virtual Disk [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/0f0ca1f9-5b45-4e2e-b901-a78bae48acda/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1019.424914] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-29e33c44-99be-4cd0-8d6f-aa007294ea14 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.433335] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for the task: (returnval){ [ 1019.433335] env[59382]: value = "task-2256767" [ 1019.433335] env[59382]: _type = "Task" [ 1019.433335] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1019.440796] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Task: {'id': task-2256767, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1019.943882] env[59382]: DEBUG oslo_vmware.exceptions [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1019.944141] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1019.944673] env[59382]: ERROR nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1019.944673] env[59382]: Faults: ['InvalidArgument'] [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Traceback (most recent call last): [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] yield resources [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self.driver.spawn(context, instance, image_meta, [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self._fetch_image_if_missing(context, vi) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] image_cache(vi, tmp_image_ds_loc) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] vm_util.copy_virtual_disk( [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] session._wait_for_task(vmdk_copy_task) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return self.wait_for_task(task_ref) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return evt.wait() [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] result = hub.switch() [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return self.greenlet.switch() [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self.f(*self.args, **self.kw) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] raise exceptions.translate_fault(task_info.error) [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Faults: ['InvalidArgument'] [ 1019.944673] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] [ 1019.945552] env[59382]: INFO nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Terminating instance [ 1019.946535] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1019.946736] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1019.946960] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-951a2ee3-4a47-4e35-a8ba-168f281a9195 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.949102] env[59382]: 
DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1019.949298] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1019.950024] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9846412-ac4f-481c-a2e4-9fcf866f3faa {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.956937] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1019.957166] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1d710670-39b6-4b85-8e79-78ec7b406655 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.959221] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1019.959394] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1019.960296] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-343b6d55-10a9-4fa1-a8a5-b44f9e048ea9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.964864] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Waiting for the task: (returnval){ [ 1019.964864] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52c37e51-50c8-3a40-73d6-51d3bf38149e" [ 1019.964864] env[59382]: _type = "Task" [ 1019.964864] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1019.971888] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52c37e51-50c8-3a40-73d6-51d3bf38149e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1020.313919] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1020.314162] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1020.314338] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Deleting the datastore file [datastore1] d31427c1-9979-4617-b5a1-43aee722d88d {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1020.314607] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-300cf428-9625-4b89-bd20-83ca50418db8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.321451] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for the task: (returnval){ [ 1020.321451] env[59382]: value = "task-2256769" [ 1020.321451] env[59382]: _type = "Task" [ 1020.321451] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1020.329212] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Task: {'id': task-2256769, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1020.475016] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1020.475362] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Creating directory with path [datastore1] vmware_temp/f39ce4f5-89fd-4f7a-9128-2fb8a2f5fbb1/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1020.475558] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-50ede8a3-6108-4304-baf3-3b27d0eecdcb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.487034] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Created directory with path [datastore1] vmware_temp/f39ce4f5-89fd-4f7a-9128-2fb8a2f5fbb1/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1020.487299] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Fetch image to [datastore1] vmware_temp/f39ce4f5-89fd-4f7a-9128-2fb8a2f5fbb1/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1020.487528] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/f39ce4f5-89fd-4f7a-9128-2fb8a2f5fbb1/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1020.488669] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3dbbf40-5fa0-4f2e-a61b-d57e08b77594 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.497981] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07da4ea8-f5b6-45e9-9917-74202faf9891 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.506618] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d83d964c-a1ed-4669-aa32-70055221c1d1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.539385] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-575ddb1e-95b5-4743-9592-6fa5e65e4fb8 {{(pid=59382) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.545726] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-75534c9e-4b41-4a20-99ba-4adaaf635b1b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.566099] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1020.774078] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1020.775276] env[59382]: ERROR nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] result = getattr(controller, method)(*args, **kwargs) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._get(image_id) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] resp, body = self.http_client.get(url, headers=header) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.request(url, 'GET', **kwargs) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._handle_response(resp) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exc.from_response(resp, resp.content) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] During handling of the above exception, another exception occurred: [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] yield resources [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.driver.spawn(context, instance, image_meta, [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._fetch_image_if_missing(context, vi) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image_fetch(context, vi, tmp_image_ds_loc) [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: 
c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] images.fetch_image( [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1020.775276] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] metadata = IMAGE_API.get(context, image_ref) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return session.show(context, image_id, [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] _reraise_translated_image_exception(image_id) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise new_exc.with_traceback(exc_trace) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] result = getattr(controller, method)(*args, **kwargs) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._get(image_id) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] resp, body = self.http_client.get(url, headers=header) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.request(url, 'GET', 
**kwargs) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._handle_response(resp) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exc.from_response(resp, resp.content) [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1020.776433] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1020.776433] env[59382]: INFO nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Terminating instance [ 1020.777290] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1020.777504] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1020.778157] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1020.778350] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1020.778605] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e6a9eca1-94cf-4385-a726-bdc0f472943d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.781495] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1420f94c-c066-4fa2-b740-bc947c9d4517 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.788561] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1020.788796] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-29a8a259-b841-4aa6-9ccc-15b9495248ca {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.790924] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1020.791114] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1020.792059] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dda2c848-028c-4b83-80b6-787949b05b14 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.796843] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Waiting for the task: (returnval){ [ 1020.796843] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528e0580-738b-3fe5-e986-dc4a7b4da3a9" [ 1020.796843] env[59382]: _type = "Task" [ 1020.796843] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1020.803834] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]528e0580-738b-3fe5-e986-dc4a7b4da3a9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1020.829606] env[59382]: DEBUG oslo_vmware.api [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Task: {'id': task-2256769, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072495} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1020.829840] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1020.830029] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1020.830203] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1020.830374] env[59382]: INFO nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Took 0.88 seconds to destroy the instance on the hypervisor. 
[ 1020.832376] env[59382]: DEBUG nova.compute.claims [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1020.832561] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1020.832798] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1020.864657] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1020.864857] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1020.865046] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Deleting the datastore file [datastore1] c2f5545d-884a-4166-a93b-810ef311c2e6 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1020.865313] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b2330b8c-43d9-489c-8543-e23156a84cde {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.874711] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Waiting for the task: (returnval){ [ 1020.874711] env[59382]: value = "task-2256771" [ 1020.874711] env[59382]: _type = "Task" [ 1020.874711] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1020.882338] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Task: {'id': task-2256771, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1020.984632] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4d8e1bf-ee32-4956-883a-4db98fadf4b8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.991675] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db044e43-12a8-4cd2-a4fc-4a194270bcc2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.020649] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbe19a30-0b48-4173-bc4b-ebee0966c5c8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.027157] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36eeb6dc-1a29-47a4-bbfb-712e02f08e8b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.040622] env[59382]: DEBUG nova.compute.provider_tree [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1021.049812] env[59382]: DEBUG nova.scheduler.client.report [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1021.062519] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.230s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.063038] env[59382]: ERROR nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1021.063038] env[59382]: Faults: ['InvalidArgument'] [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Traceback (most recent call last): [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] 
self.driver.spawn(context, instance, image_meta, [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self._fetch_image_if_missing(context, vi) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] image_cache(vi, tmp_image_ds_loc) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] vm_util.copy_virtual_disk( [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] session._wait_for_task(vmdk_copy_task) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return self.wait_for_task(task_ref) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return evt.wait() [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] result = hub.switch() [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] return self.greenlet.switch() [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] self.f(*self.args, **self.kw) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1021.063038] env[59382]: ERROR 
nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] raise exceptions.translate_fault(task_info.error) [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Faults: ['InvalidArgument'] [ 1021.063038] env[59382]: ERROR nova.compute.manager [instance: d31427c1-9979-4617-b5a1-43aee722d88d] [ 1021.063893] env[59382]: DEBUG nova.compute.utils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1021.065119] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Build of instance d31427c1-9979-4617-b5a1-43aee722d88d was re-scheduled: A specified parameter was not correct: fileType [ 1021.065119] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1021.065494] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1021.065664] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1021.065814] env[59382]: DEBUG nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1021.065970] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1021.307226] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1021.307516] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Creating directory with path [datastore1] vmware_temp/681c1d75-2bad-4e52-b042-9b29d95b0e1c/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1021.307753] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4772394c-c880-4fde-820e-dbac78eebf30 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.319041] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Created directory with path [datastore1] vmware_temp/681c1d75-2bad-4e52-b042-9b29d95b0e1c/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1021.319250] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Fetch image to [datastore1] vmware_temp/681c1d75-2bad-4e52-b042-9b29d95b0e1c/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1021.319422] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/681c1d75-2bad-4e52-b042-9b29d95b0e1c/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1021.320170] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df1f202e-e87c-44a0-85e0-e08ce49f5f19 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.327628] env[59382]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15c37911-0da0-4ba7-986f-5f4eec040510 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.330090] env[59382]: DEBUG nova.network.neutron [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1021.338346] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b7ec66-93fa-4fcd-95d2-d154dbcf5153 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.342815] env[59382]: INFO nova.compute.manager [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Took 0.28 seconds to deallocate network for instance. [ 1021.371980] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0424c532-cfcd-4abb-a9e1-2fd182053128 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.380344] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4ce5ffaa-fb59-41bb-a02a-8a464b17d398 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.384850] env[59382]: DEBUG oslo_vmware.api [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Task: {'id': task-2256771, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070149} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1021.385432] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1021.385627] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1021.385822] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1021.386088] env[59382]: INFO nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1021.390052] env[59382]: DEBUG nova.compute.claims [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1021.390223] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.390434] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.405235] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1021.416527] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.417319] env[59382]: DEBUG nova.compute.utils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance c2f5545d-884a-4166-a93b-810ef311c2e6 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1021.418830] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance disappeared during build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1021.419100] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1021.419157] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1021.419350] env[59382]: DEBUG nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1021.419527] env[59382]: DEBUG nova.network.neutron [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1021.450902] env[59382]: INFO nova.scheduler.client.report [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Deleted allocations for instance d31427c1-9979-4617-b5a1-43aee722d88d [ 1021.466852] env[59382]: DEBUG oslo_concurrency.lockutils [None req-db703bfa-13af-43de-844d-c92bcaa821ea tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 478.175s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.468156] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 279.850s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.468156] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Acquiring lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.468367] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.468533] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.470333] env[59382]: INFO nova.compute.manager [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 
tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Terminating instance [ 1021.471982] env[59382]: DEBUG nova.compute.manager [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1021.472189] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1021.472796] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-772dcebc-d4a7-430e-b761-e7e895b11349 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.482325] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0756f5-a60f-46e3-ac96-4ad488421dfc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.494779] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1021.514275] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d31427c1-9979-4617-b5a1-43aee722d88d could not be found. [ 1021.514502] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1021.514679] env[59382]: INFO nova.compute.manager [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1021.514915] env[59382]: DEBUG oslo.service.loopingcall [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1021.515901] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1021.516627] env[59382]: ERROR nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] result = getattr(controller, method)(*args, **kwargs) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._get(image_id) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] resp, body = self.http_client.get(url, headers=header) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.request(url, 'GET', **kwargs) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._handle_response(resp) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise exc.from_response(resp, resp.content) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] During handling of the above exception, another exception occurred: [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] yield resources [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.driver.spawn(context, instance, image_meta, [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._fetch_image_if_missing(context, vi) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image_fetch(context, vi, tmp_image_ds_loc) [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] images.fetch_image( [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1021.516627] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] metadata = IMAGE_API.get(context, image_ref) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return session.show(context, image_id, [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] _reraise_translated_image_exception(image_id) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise new_exc.with_traceback(exc_trace) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] result = getattr(controller, method)(*args, **kwargs) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._get(image_id) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] resp, body = self.http_client.get(url, headers=header) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.request(url, 'GET', **kwargs) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._handle_response(resp) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise 
exc.from_response(resp, resp.content) [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1021.517833] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1021.517833] env[59382]: INFO nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Terminating instance [ 1021.518482] env[59382]: DEBUG nova.compute.manager [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1021.518482] env[59382]: DEBUG nova.network.neutron [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1021.519343] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1021.519343] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1021.519831] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1021.520033] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1021.520233] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4fe331d8-6de9-4878-8cf7-403fc69979cf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.522557] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-676b4b90-eb6a-4712-a062-0f3a9ce4e658 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.529655] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1021.529912] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c78e62dc-07bf-4ec7-b1c7-d2390638b0ae {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.532591] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1021.532764] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1021.533962] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ee9ddb3-e249-429e-95b0-07f887074c5f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.542728] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Waiting for the task: (returnval){ [ 1021.542728] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b61fd4-8eaa-1b42-4fbf-12e0620a8b2e" [ 1021.542728] env[59382]: _type = "Task" [ 1021.542728] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1021.552416] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b61fd4-8eaa-1b42-4fbf-12e0620a8b2e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1021.556482] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.556709] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.558150] env[59382]: INFO nova.compute.claims [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1021.603754] env[59382]: DEBUG nova.network.neutron [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1021.609861] env[59382]: DEBUG neutronclient.v2_0.client [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1021.612753] env[59382]: ERROR nova.compute.manager [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] result = getattr(controller, method)(*args, **kwargs) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._get(image_id) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] resp, body = self.http_client.get(url, headers=header) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.request(url, 'GET', **kwargs) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._handle_response(resp) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exc.from_response(resp, resp.content) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] During handling of the above exception, another exception occurred: [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.driver.spawn(context, instance, image_meta, [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._fetch_image_if_missing(context, vi) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image_fetch(context, vi, tmp_image_ds_loc) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] images.fetch_image( [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] metadata = IMAGE_API.get(context, image_ref) [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return session.show(context, image_id, [ 1021.612753] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] _reraise_translated_image_exception(image_id) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise new_exc.with_traceback(exc_trace) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: 
c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] result = getattr(controller, method)(*args, **kwargs) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._get(image_id) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] resp, body = self.http_client.get(url, headers=header) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.request(url, 'GET', **kwargs) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._handle_response(resp) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exc.from_response(resp, resp.content) [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] During handling of the above exception, another exception occurred: [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._build_and_run_instance(context, instance, image, [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] with excutils.save_and_reraise_exception(): [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.force_reraise() [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise self.value [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] with self.rt.instance_claim(context, instance, node, allocs, [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.abort() [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1021.613788] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return f(*args, **kwargs) [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._unset_instance_host_and_node(instance) [ 1021.614897] env[59382]: ERROR nova.compute.manager 
[instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] instance.save() [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] updates, result = self.indirection_api.object_action( [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return cctxt.call(context, 'object_action', objinst=objinst, [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] result = self.transport._send( [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._driver.send(target, ctxt, message, [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise result [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] nova.exception_Remote.InstanceNotFound_Remote: Instance c2f5545d-884a-4166-a93b-810ef311c2e6 could not be found. 
[ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return getattr(target, method)(*args, **kwargs) [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return fn(self, *args, **kwargs) [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] old_ref, inst_ref = db.instance_update_and_get_original( [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return f(*args, **kwargs) [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] with excutils.save_and_reraise_exception() as ectxt: [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.force_reraise() [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise self.value [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return f(*args, 
**kwargs) [ 1021.614897] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return f(context, *args, **kwargs) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exception.InstanceNotFound(instance_id=uuid) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] nova.exception.InstanceNotFound: Instance c2f5545d-884a-4166-a93b-810ef311c2e6 could not be found. [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] During handling of the above exception, another exception occurred: [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] exception_handler_v20(status_code, error_body) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise client_exc(message=error_message, [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1021.615976] 
env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Neutron server returns request_ids: ['req-d3a0b637-49f8-412e-a89b-a09d8d7b28fe'] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] During handling of the above exception, another exception occurred: [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Traceback (most recent call last): [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._deallocate_network(context, instance, requested_networks) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self.network_api.deallocate_for_instance( [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] data = neutron.list_ports(**search_opts) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.list('ports', self.ports_path, retrieve_all, [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1021.615976] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] for r in self._pagination(collection, path, **params): [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] res = self.get(path, params=params) [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.retry_request("GET", action, body=body, [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] return self.do_request(method, action, body=body, [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] ret = obj(*args, **kwargs) [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] self._handle_fault_response(status_code, replybody, resp) [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] raise exception.Unauthorized() [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] nova.exception.Unauthorized: Not authorized. [ 1021.617019] env[59382]: ERROR nova.compute.manager [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] [ 1021.617019] env[59382]: INFO nova.compute.manager [-] [instance: d31427c1-9979-4617-b5a1-43aee722d88d] Took 0.10 seconds to deallocate network for instance. 
[ 1021.635091] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1021.635336] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1021.635519] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Deleting the datastore file [datastore1] feea4bca-d134-475f-81b9-c8415bacf1f1 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1021.636607] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b802e1ca-4a43-4453-b9e5-a0dda8998795 tempest-ServerDiskConfigTestJSON-1273022575 tempest-ServerDiskConfigTestJSON-1273022575-project-member] Lock "c2f5545d-884a-4166-a93b-810ef311c2e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 399.709s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.636843] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-80a6b637-128e-4713-8f57-9f938a78c589 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.647113] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Waiting for the task: (returnval){ [ 1021.647113] env[59382]: value = "task-2256773" [ 1021.647113] env[59382]: _type = "Task" [ 1021.647113] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1021.658859] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1021.661221] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Task: {'id': task-2256773, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1021.702618] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.709478] env[59382]: DEBUG oslo_concurrency.lockutils [None req-484a1810-7097-4305-9da2-9558b690c820 tempest-TenantUsagesTestJSON-208037858 tempest-TenantUsagesTestJSON-208037858-project-member] Lock "d31427c1-9979-4617-b5a1-43aee722d88d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.241s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.753325] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cb1ef3f-a355-4649-94e6-3882ea46516c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.760515] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273a01db-8b20-4b2e-aade-f21c265b6183 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.789386] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0f8d42c-d5ab-4ce5-87b7-d878c3304dd4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.795746] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de5b4259-9a20-4be5-a3b2-5bc234aa3862 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.808722] env[59382]: DEBUG nova.compute.provider_tree [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1021.816433] env[59382]: DEBUG nova.scheduler.client.report [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1021.828648] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59382) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.829099] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1021.831567] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.129s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.832482] env[59382]: INFO nova.compute.claims [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1021.861320] env[59382]: DEBUG nova.compute.utils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1021.862483] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1021.862727] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1021.869177] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Start building block device mappings for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1021.919455] env[59382]: DEBUG nova.policy [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54933574bfc4dafa8ed2bace4597bbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec3ea647eaf467cbd033067f6e4dbfa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1021.927300] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1021.949230] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1021.949488] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1021.949647] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1021.949867] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1021.950029] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
1021.950178] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1021.950383] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1021.950540] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1021.950704] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1021.950864] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1021.951082] env[59382]: DEBUG nova.virt.hardware [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1021.954109] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233801ed-5317-4bc4-80fd-578b4ef30a9e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.962035] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a736673f-7f38-444a-9791-e09e3340e2cf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.000417] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-499b240b-b79c-4ca1-a31f-4bcb1ae03680 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.007791] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c1f3eaa-4847-4595-83df-ae3fa1f6af3f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.039015] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cad4eea-c307-4a78-9673-3938d70b2cd2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.049558] env[59382]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed29032-091b-47de-848a-03ef9e0db0d2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.065273] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1022.065508] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Creating directory with path [datastore1] vmware_temp/9f34edbf-d403-413a-9d5b-bc221921c7b2/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.065916] env[59382]: DEBUG nova.compute.provider_tree [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1022.067011] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-117d264f-e0c3-4692-a8f2-b2e2c92b8d0f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.074684] env[59382]: DEBUG nova.scheduler.client.report [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1022.084177] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Created directory with path [datastore1] vmware_temp/9f34edbf-d403-413a-9d5b-bc221921c7b2/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.084177] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Fetch image to [datastore1] vmware_temp/9f34edbf-d403-413a-9d5b-bc221921c7b2/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1022.084177] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Downloading 
image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/9f34edbf-d403-413a-9d5b-bc221921c7b2/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1022.084177] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f9e5f8-3946-4571-95e2-f90869595b26 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.091546] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93812fd9-ade3-401e-b65f-23175132ff6c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.094823] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.095288] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1022.106435] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3260221a-669b-4635-95ec-ee2563564ad8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.139251] env[59382]: DEBUG nova.compute.utils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1022.144019] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-447aac8a-0e12-4d1b-a100-ea3203b7a73f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.144019] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Allocating IP information in the background. 
{{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1022.144203] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1022.154568] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-614aee27-e9c2-4c9f-aaa4-2a5827621dae {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.155298] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1022.160565] env[59382]: DEBUG oslo_vmware.api [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Task: {'id': task-2256773, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.080921} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1022.162453] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1022.162453] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1022.162453] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1022.162453] env[59382]: INFO nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1022.165113] env[59382]: DEBUG nova.compute.claims [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1022.165113] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.165113] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.182940] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1022.197512] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.033s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.198242] env[59382]: DEBUG nova.compute.utils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance feea4bca-d134-475f-81b9-c8415bacf1f1 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1022.202135] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance disappeared during build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1022.202311] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1022.202502] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1022.202667] env[59382]: DEBUG nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1022.202834] env[59382]: DEBUG nova.network.neutron [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1022.237121] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1022.241144] env[59382]: DEBUG nova.policy [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54933574bfc4dafa8ed2bace4597bbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec3ea647eaf467cbd033067f6e4dbfa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1022.241144] env[59382]: DEBUG neutronclient.v2_0.client [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1022.242277] env[59382]: ERROR nova.compute.manager [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] result = getattr(controller, method)(*args, **kwargs) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._get(image_id) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] resp, body = self.http_client.get(url, headers=header) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.request(url, 'GET', **kwargs) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._handle_response(resp) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise exc.from_response(resp, resp.content) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] During handling of the above exception, another exception occurred: [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.driver.spawn(context, instance, image_meta, [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._fetch_image_if_missing(context, vi) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image_fetch(context, vi, tmp_image_ds_loc) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] images.fetch_image( [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] metadata = IMAGE_API.get(context, image_ref) [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return session.show(context, image_id, [ 1022.242277] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] _reraise_translated_image_exception(image_id) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise new_exc.with_traceback(exc_trace) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: 
feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] result = getattr(controller, method)(*args, **kwargs) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._get(image_id) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] resp, body = self.http_client.get(url, headers=header) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.request(url, 'GET', **kwargs) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._handle_response(resp) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise exc.from_response(resp, resp.content) [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] During handling of the above exception, another exception occurred: [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._build_and_run_instance(context, instance, image, [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] with excutils.save_and_reraise_exception(): [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.force_reraise() [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise self.value [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] with self.rt.instance_claim(context, instance, node, allocs, [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.abort() [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1022.243356] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return f(*args, **kwargs) [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._unset_instance_host_and_node(instance) [ 1022.245475] env[59382]: ERROR nova.compute.manager 
[instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] instance.save() [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] updates, result = self.indirection_api.object_action( [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return cctxt.call(context, 'object_action', objinst=objinst, [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] result = self.transport._send( [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._driver.send(target, ctxt, message, [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise result [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] nova.exception_Remote.InstanceNotFound_Remote: Instance feea4bca-d134-475f-81b9-c8415bacf1f1 could not be found. 
[ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return getattr(target, method)(*args, **kwargs) [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return fn(self, *args, **kwargs) [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] old_ref, inst_ref = db.instance_update_and_get_original( [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return f(*args, **kwargs) [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] with excutils.save_and_reraise_exception() as ectxt: [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.force_reraise() [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise self.value [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return f(*args, 
**kwargs) [ 1022.245475] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return f(context, *args, **kwargs) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise exception.InstanceNotFound(instance_id=uuid) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] nova.exception.InstanceNotFound: Instance feea4bca-d134-475f-81b9-c8415bacf1f1 could not be found. [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] During handling of the above exception, another exception occurred: [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] exception_handler_v20(status_code, error_body) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise client_exc(message=error_message, [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1022.246624] 
env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Neutron server returns request_ids: ['req-75822a9e-31b5-4e12-bf92-d95d8626d8af'] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] During handling of the above exception, another exception occurred: [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Traceback (most recent call last): [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._deallocate_network(context, instance, requested_networks) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self.network_api.deallocate_for_instance( [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] data = neutron.list_ports(**search_opts) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.list('ports', self.ports_path, retrieve_all, [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1022.246624] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] for r in self._pagination(collection, path, **params): [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] res = self.get(path, params=params) [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.retry_request("GET", action, body=body, [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] return self.do_request(method, action, body=body, [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] ret = obj(*args, **kwargs) [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] self._handle_fault_response(status_code, replybody, resp) [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] raise exception.Unauthorized() [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] nova.exception.Unauthorized: Not authorized. [ 1022.247870] env[59382]: ERROR nova.compute.manager [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] [ 1022.256891] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Successfully created port: cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1022.271396] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f873c82-010c-45ed-b04c-3042265cbc95 tempest-ServersTestMultiNic-1590974874 tempest-ServersTestMultiNic-1590974874-project-member] Lock "feea4bca-d134-475f-81b9-c8415bacf1f1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 397.606s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.284145] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1022.314395] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1022.314395] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1022.314395] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1022.314584] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1022.314677] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1022.314847] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1022.315069] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1022.315229] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1022.315397] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1022.315562] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1022.315733] env[59382]: DEBUG nova.virt.hardware [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1022.316633] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9328bcdb-90c7-422a-a165-db46f837c2bd {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.327020] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58dcc2fa-fe01-44c2-9a71-f04f71ffa5a3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.347149] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1022.347149] env[59382]: ERROR nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] result = getattr(controller, method)(*args, **kwargs) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._get(image_id) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] resp, body = self.http_client.get(url, headers=header) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.request(url, 'GET', **kwargs) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._handle_response(resp) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exc.from_response(resp, resp.content) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] During handling of the above exception, another exception occurred: [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] yield resources [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.driver.spawn(context, instance, image_meta, [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._fetch_image_if_missing(context, vi) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image_fetch(context, vi, tmp_image_ds_loc) [ 1022.347149] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] images.fetch_image( [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] metadata = IMAGE_API.get(context, image_ref) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return session.show(context, image_id, [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] _reraise_translated_image_exception(image_id) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise new_exc.with_traceback(exc_trace) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] result = getattr(controller, method)(*args, **kwargs) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._get(image_id) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] resp, body = self.http_client.get(url, headers=header) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.request(url, 'GET', **kwargs) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._handle_response(resp) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exc.from_response(resp, resp.content) [ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1022.348186] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1022.348186] env[59382]: INFO nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Terminating instance [ 1022.348950] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1022.348950] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.349618] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.349825] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.351206] env[59382]: INFO nova.compute.claims [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1022.353357] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8d89cc35-34a2-4d6d-9c32-d5c7dc0e7b09 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.358268] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Start destroying the instance on the hypervisor. 
{{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1022.358268] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1022.358268] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ace52c9-1c3d-46c9-af22-0eb251e6083b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.363913] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1022.365031] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b02c38e1-f683-486e-a266-1265e2084166 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.366464] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.366633] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1022.367368] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ac996cfa-205c-415b-9e25-6cac81405a22 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.372084] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Waiting for the task: (returnval){ [ 1022.372084] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52348a37-9e24-016f-985e-b4ad2dbdfefb" [ 1022.372084] env[59382]: _type = "Task" [ 1022.372084] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1022.379680] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52348a37-9e24-016f-985e-b4ad2dbdfefb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1022.427783] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1022.427986] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1022.428188] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Deleting the datastore file [datastore1] 3c235411-c50f-40b5-a681-ca42b7838506 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1022.428436] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fdb35d53-c815-4dbf-a2a8-200ad2519cc4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.438056] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Waiting for the task: (returnval){ [ 1022.438056] env[59382]: value = "task-2256775" [ 1022.438056] env[59382]: _type = "Task" [ 1022.438056] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1022.445516] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Task: {'id': task-2256775, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1022.538169] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82c8beac-4e59-46d5-9997-6b157588fa41 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.545632] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c9696a4-3314-4b1b-b571-a5a674ea85f2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.579075] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05760e3d-f199-47d1-8eb7-fa819b7e7751 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.586354] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c02af5-cb19-43db-99a9-b3938fe1ad4c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.606018] env[59382]: DEBUG nova.compute.provider_tree [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1022.617019] env[59382]: DEBUG nova.scheduler.client.report [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1022.638217] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.288s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.638719] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1022.677137] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Successfully created port: a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1022.679964] env[59382]: DEBUG nova.compute.utils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1022.682649] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1022.683031] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1022.692018] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1022.763198] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Start spawning the instance on the hypervisor. 
{{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1022.788494] env[59382]: DEBUG nova.policy [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54933574bfc4dafa8ed2bace4597bbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec3ea647eaf467cbd033067f6e4dbfa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1022.800868] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1022.801134] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1022.801293] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1022.801473] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1022.801622] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1022.801770] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 
1022.801978] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1022.802293] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1022.802527] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1022.803062] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1022.803261] env[59382]: DEBUG nova.virt.hardware [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1022.804201] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2763cf5-0206-4fb9-9348-697dc793f596 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.812494] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61e5a06d-0275-4e39-b607-4d64ff7c1892 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.881688] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1022.881978] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Creating directory with path [datastore1] vmware_temp/cd82d364-3723-4c5f-8b28-6e194b4b3330/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.882181] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-14f6ed86-e92f-497d-9161-20212874ee3e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.893294] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 
tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Created directory with path [datastore1] vmware_temp/cd82d364-3723-4c5f-8b28-6e194b4b3330/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.893512] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Fetch image to [datastore1] vmware_temp/cd82d364-3723-4c5f-8b28-6e194b4b3330/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1022.893671] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/cd82d364-3723-4c5f-8b28-6e194b4b3330/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1022.894530] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c6180a5-4630-4222-adb7-7857abd60918 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.901573] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fe9d323-2601-4cdb-9848-94bfdbbd246f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.912254] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3343f611-bb6b-43bf-be64-7283b0ac4147 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.917134] env[59382]: DEBUG nova.compute.manager [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Received event network-vif-plugged-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1022.917361] env[59382]: DEBUG oslo_concurrency.lockutils [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] Acquiring lock "2dfb7e00-fea7-4186-914a-98e1e5fbe49a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.917857] env[59382]: DEBUG oslo_concurrency.lockutils [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] Lock "2dfb7e00-fea7-4186-914a-98e1e5fbe49a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.917994] env[59382]: DEBUG oslo_concurrency.lockutils [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] Lock "2dfb7e00-fea7-4186-914a-98e1e5fbe49a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.918175] env[59382]: DEBUG nova.compute.manager [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] No waiting events found dispatching network-vif-plugged-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1022.918337] env[59382]: WARNING nova.compute.manager [req-bf6483cc-fbba-46f2-842a-3c9b7cfb035c req-64ada439-2c5d-47c5-88e0-c60096e983dc service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Received unexpected event network-vif-plugged-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 for instance with vm_state building and task_state spawning. [ 1022.947423] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Successfully updated port: cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1022.953219] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a97f0230-7048-4271-ba69-f8de71421044 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.963159] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0ee774f6-bcb9-4ccd-a4ef-3bba9b83b327 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.963893] env[59382]: DEBUG oslo_vmware.api [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Task: {'id': task-2256775, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072215} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1022.964479] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1022.964479] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1022.964631] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1022.965905] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1022.965969] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1022.966158] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1022.966348] env[59382]: INFO nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1022.968926] env[59382]: DEBUG nova.compute.claims [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1022.968926] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.968926] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.987322] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1022.995740] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.996578] env[59382]: DEBUG nova.compute.utils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance 3c235411-c50f-40b5-a681-ca42b7838506 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1022.999885] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance disappeared during build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1023.000068] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1023.000247] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1023.000393] env[59382]: DEBUG nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1023.000545] env[59382]: DEBUG nova.network.neutron [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1023.003388] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1023.035821] env[59382]: DEBUG neutronclient.v2_0.client [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1023.037441] env[59382]: ERROR nova.compute.manager [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] result = getattr(controller, method)(*args, **kwargs) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._get(image_id) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] resp, body = self.http_client.get(url, headers=header) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.request(url, 'GET', **kwargs) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._handle_response(resp) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exc.from_response(resp, resp.content) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] During handling of the above exception, another exception occurred: [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.driver.spawn(context, instance, image_meta, [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._fetch_image_if_missing(context, vi) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image_fetch(context, vi, tmp_image_ds_loc) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] images.fetch_image( [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] metadata = IMAGE_API.get(context, image_ref) [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return session.show(context, image_id, [ 1023.037441] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] _reraise_translated_image_exception(image_id) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise new_exc.with_traceback(exc_trace) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 
3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] result = getattr(controller, method)(*args, **kwargs) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._get(image_id) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] resp, body = self.http_client.get(url, headers=header) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.request(url, 'GET', **kwargs) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._handle_response(resp) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exc.from_response(resp, resp.content) [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] During handling of the above exception, another exception occurred: [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._build_and_run_instance(context, instance, image, [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] with excutils.save_and_reraise_exception(): [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.force_reraise() [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise self.value [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] with self.rt.instance_claim(context, instance, node, allocs, [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.abort() [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1023.038630] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return f(*args, **kwargs) [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._unset_instance_host_and_node(instance) [ 1023.039622] env[59382]: ERROR nova.compute.manager 
[instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] instance.save() [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] updates, result = self.indirection_api.object_action( [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return cctxt.call(context, 'object_action', objinst=objinst, [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] result = self.transport._send( [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._driver.send(target, ctxt, message, [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise result [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] nova.exception_Remote.InstanceNotFound_Remote: Instance 3c235411-c50f-40b5-a681-ca42b7838506 could not be found. 
[ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return getattr(target, method)(*args, **kwargs) [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return fn(self, *args, **kwargs) [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] old_ref, inst_ref = db.instance_update_and_get_original( [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return f(*args, **kwargs) [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] with excutils.save_and_reraise_exception() as ectxt: [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.force_reraise() [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise self.value [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return f(*args, 
**kwargs) [ 1023.039622] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return f(context, *args, **kwargs) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exception.InstanceNotFound(instance_id=uuid) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] nova.exception.InstanceNotFound: Instance 3c235411-c50f-40b5-a681-ca42b7838506 could not be found. [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] During handling of the above exception, another exception occurred: [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] exception_handler_v20(status_code, error_body) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise client_exc(message=error_message, [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1023.040607] 
env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Neutron server returns request_ids: ['req-38b7f6e1-88fd-46df-a378-242257d5d86b'] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] During handling of the above exception, another exception occurred: [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Traceback (most recent call last): [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._deallocate_network(context, instance, requested_networks) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self.network_api.deallocate_for_instance( [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] data = neutron.list_ports(**search_opts) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.list('ports', self.ports_path, retrieve_all, [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1023.040607] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] for r in self._pagination(collection, path, **params): [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] res = self.get(path, params=params) [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.retry_request("GET", action, body=body, [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] return self.do_request(method, action, body=body, [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] ret = obj(*args, **kwargs) [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] self._handle_fault_response(status_code, replybody, resp) [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] raise exception.Unauthorized() [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] nova.exception.Unauthorized: Not authorized. [ 1023.047102] env[59382]: ERROR nova.compute.manager [instance: 3c235411-c50f-40b5-a681-ca42b7838506] [ 1023.066969] env[59382]: DEBUG oslo_concurrency.lockutils [None req-38a03f50-f91b-4afd-bade-7a7ba7f3da3b tempest-AttachVolumeNegativeTest-1025533387 tempest-AttachVolumeNegativeTest-1025533387-project-member] Lock "3c235411-c50f-40b5-a681-ca42b7838506" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 396.453s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.078163] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1023.102157] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1023.102939] env[59382]: ERROR nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] result = getattr(controller, method)(*args, **kwargs) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._get(image_id) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] resp, body = self.http_client.get(url, headers=header) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.request(url, 'GET', **kwargs) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._handle_response(resp) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise exc.from_response(resp, resp.content) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] During handling of the above exception, another exception occurred: [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] yield resources [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.driver.spawn(context, instance, image_meta, [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._fetch_image_if_missing(context, vi) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image_fetch(context, vi, tmp_image_ds_loc) [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] images.fetch_image( [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1023.102939] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] metadata = IMAGE_API.get(context, image_ref) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return session.show(context, image_id, [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] _reraise_translated_image_exception(image_id) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise new_exc.with_traceback(exc_trace) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] result = getattr(controller, method)(*args, **kwargs) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._get(image_id) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] resp, body = self.http_client.get(url, headers=header) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.request(url, 'GET', **kwargs) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._handle_response(resp) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise 
exc.from_response(resp, resp.content) [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1023.103918] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.103918] env[59382]: INFO nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Terminating instance [ 1023.105192] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1023.105428] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.106289] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1023.111641] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1023.111910] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3831d3a7-cfc7-4c6d-b2e2-45e33568d032 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.115593] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-011c05a0-9593-4100-bcee-750645d2e702 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.128371] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1023.130022] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-55ad63a4-e904-4499-9555-a0f87bda56c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.131093] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created directory with path [datastore1] 
devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.131351] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1023.131940] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95e13c61-40f0-47f1-929d-60893ab58abb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.136795] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 1023.136795] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52872ccc-c85e-06f4-07fb-11223a2a6503" [ 1023.136795] env[59382]: _type = "Task" [ 1023.136795] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.137636] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.137862] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.139335] env[59382]: INFO nova.compute.claims [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1023.149065] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52872ccc-c85e-06f4-07fb-11223a2a6503, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.194322] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1023.194544] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1023.194701] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Deleting the datastore file [datastore1] acae2ecc-9a00-4356-96d7-a7521ea46f32 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1023.194949] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3c7578bb-d37f-4ccf-bca8-4ab13f36425e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.201458] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Waiting for the task: (returnval){ [ 1023.201458] env[59382]: value = "task-2256777" [ 1023.201458] env[59382]: _type = "Task" [ 1023.201458] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.211542] env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Task: {'id': task-2256777, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.346991] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16da5ae1-fcfb-4a97-8b76-29e7999e0e40 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.355883] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8367ea-812c-4727-a0c5-4ed7919ee33f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.385803] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e288e3c-a07a-41b6-a3e1-f3e1e5b232b4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.388013] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Successfully created port: 72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1023.394466] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-904a6a7e-283c-42f8-bd03-5bebd320bc72 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.407802] env[59382]: DEBUG nova.compute.provider_tree [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1023.418484] env[59382]: DEBUG nova.scheduler.client.report [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1023.425735] env[59382]: DEBUG nova.network.neutron [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Updating instance_info_cache with network_info: [{"id": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "address": "fa:16:3e:a3:82:8b", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], 
"meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf7c2d90-e7", "ovs_interfaceid": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1023.433997] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.296s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.434376] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1023.437388] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1023.437599] env[59382]: DEBUG nova.compute.manager [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Instance network_info: |[{"id": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "address": "fa:16:3e:a3:82:8b", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf7c2d90-e7", "ovs_interfaceid": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1023.438134] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None 
req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a3:82:8b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '47499d09-8010-4d02-ac96-4f057c104692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cf7c2d90-e7e6-4c8e-a1ea-abbebb909958', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1023.445385] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Creating folder: Project (7ec3ea647eaf467cbd033067f6e4dbfa). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1023.445840] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1d5406b-e660-42a9-a9cb-06556d6c3568 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.456911] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Created folder: Project (7ec3ea647eaf467cbd033067f6e4dbfa) in parent group-v459741. [ 1023.457102] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Creating folder: Instances. Parent ref: group-v459803. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1023.457356] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-717a5a7c-4e02-43c9-bb99-9856f99b8102 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.465896] env[59382]: DEBUG nova.compute.utils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1023.468419] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1023.468619] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1023.473019] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Created folder: Instances in parent group-v459803. 
[ 1023.473019] env[59382]: DEBUG oslo.service.loopingcall [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1023.473019] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1023.473019] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6893a9cd-6441-48f3-bc36-eb03340484e8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.486792] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1023.494730] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1023.494730] env[59382]: value = "task-2256780" [ 1023.494730] env[59382]: _type = "Task" [ 1023.494730] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.502789] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256780, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.521783] env[59382]: INFO nova.virt.block_device [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Booting with volume 21aaabc8-34b1-426a-b9f5-7f97f0fe6128 at /dev/sda [ 1023.568757] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bdbcfb12-7c11-4a22-a3c2-040f847d3b37 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.577793] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f93000-ca49-4a8c-9497-0f64ef0567b7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.604245] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0fd1eee8-fd7a-40d9-a958-069b0a2a8115 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.612683] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ad96400-b880-4641-81b4-97c105ec8351 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.650838] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c133b0df-5e0e-41a6-b142-fbf28de85c45 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.667279] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 
tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1023.668968] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating directory with path [datastore1] vmware_temp/51c837b8-bf18-4103-bcda-509b99a3c07d/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.668968] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76442756-d322-4e36-bfdf-58f74b547594 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.672957] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-730f155b-8c26-4135-8704-098897ac48f4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.692124] env[59382]: DEBUG nova.virt.block_device [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updating existing volume attachment record: 126d0df8-8fff-41fa-b8cf-8c49efac2e46 {{(pid=59382) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1023.697414] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created directory with path [datastore1] vmware_temp/51c837b8-bf18-4103-bcda-509b99a3c07d/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.697771] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Fetch image to [datastore1] vmware_temp/51c837b8-bf18-4103-bcda-509b99a3c07d/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1023.698168] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/51c837b8-bf18-4103-bcda-509b99a3c07d/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1023.699477] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28382a4-b3de-4a0c-b5b2-944ac46be14e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.713674] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6da731bb-4757-43e6-b8e2-bf22a4754f69 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.725447] 
env[59382]: DEBUG oslo_vmware.api [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Task: {'id': task-2256777, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072319} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1023.726301] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1023.726638] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1023.727220] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1023.727369] env[59382]: INFO nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1023.736915] env[59382]: DEBUG nova.compute.claims [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1023.737219] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.737585] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.745021] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c04650b-31a4-4b48-b2bb-4fe30707a685 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.784221] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dbc177f-546a-471e-9028-ce0212242362 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.787056] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.049s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.787782] env[59382]: DEBUG nova.compute.utils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1023.789262] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance disappeared during build. 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1023.789472] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1023.789669] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1023.789865] env[59382]: DEBUG nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1023.790069] env[59382]: DEBUG nova.network.neutron [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1023.795113] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-526de32e-edf6-46f1-92a3-b82b6bca7985 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.816446] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1023.852654] env[59382]: DEBUG nova.policy [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '10ae8c0465b44939949b0e99bc65d9a3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fe7ee8ba08b5483dba2ba9f7f0851021', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1023.961499] env[59382]: DEBUG neutronclient.v2_0.client [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1023.963045] env[59382]: ERROR 
nova.compute.manager [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] result = getattr(controller, method)(*args, **kwargs) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._get(image_id) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] resp, body = self.http_client.get(url, headers=header) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.request(url, 'GET', **kwargs) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._handle_response(resp) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise exc.from_response(resp, resp.content) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] During handling of the above exception, another exception occurred: [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.driver.spawn(context, instance, image_meta, [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._fetch_image_if_missing(context, vi) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image_fetch(context, vi, tmp_image_ds_loc) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] images.fetch_image( [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] metadata = IMAGE_API.get(context, image_ref) [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return session.show(context, image_id, [ 1023.963045] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] _reraise_translated_image_exception(image_id) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise new_exc.with_traceback(exc_trace) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: 
acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] result = getattr(controller, method)(*args, **kwargs) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._get(image_id) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] resp, body = self.http_client.get(url, headers=header) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.request(url, 'GET', **kwargs) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._handle_response(resp) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise exc.from_response(resp, resp.content) [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] During handling of the above exception, another exception occurred: [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._build_and_run_instance(context, instance, image, [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] with excutils.save_and_reraise_exception(): [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.force_reraise() [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise self.value [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] with self.rt.instance_claim(context, instance, node, allocs, [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.abort() [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1023.964191] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return f(*args, **kwargs) [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._unset_instance_host_and_node(instance) [ 1023.965126] env[59382]: ERROR nova.compute.manager 
[instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] instance.save() [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] updates, result = self.indirection_api.object_action( [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return cctxt.call(context, 'object_action', objinst=objinst, [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] result = self.transport._send( [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._driver.send(target, ctxt, message, [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise result [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] nova.exception_Remote.InstanceNotFound_Remote: Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 could not be found. 
[ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return getattr(target, method)(*args, **kwargs) [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return fn(self, *args, **kwargs) [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] old_ref, inst_ref = db.instance_update_and_get_original( [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return f(*args, **kwargs) [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] with excutils.save_and_reraise_exception() as ectxt: [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.force_reraise() [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise self.value [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return f(*args, 
**kwargs) [ 1023.965126] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return f(context, *args, **kwargs) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise exception.InstanceNotFound(instance_id=uuid) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] nova.exception.InstanceNotFound: Instance acae2ecc-9a00-4356-96d7-a7521ea46f32 could not be found. [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] During handling of the above exception, another exception occurred: [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] exception_handler_v20(status_code, error_body) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise client_exc(message=error_message, [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1023.966193] 
env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Neutron server returns request_ids: ['req-e686d27d-ff8a-4a90-934b-0656870e4549'] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] During handling of the above exception, another exception occurred: [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Traceback (most recent call last): [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._deallocate_network(context, instance, requested_networks) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self.network_api.deallocate_for_instance( [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] data = neutron.list_ports(**search_opts) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.list('ports', self.ports_path, retrieve_all, [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1023.966193] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] for r in self._pagination(collection, path, **params): [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] res = self.get(path, params=params) [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.retry_request("GET", action, body=body, [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] return self.do_request(method, action, body=body, [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] ret = obj(*args, **kwargs) [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] self._handle_fault_response(status_code, replybody, resp) [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] raise exception.Unauthorized() [ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] nova.exception.Unauthorized: Not authorized. 
[ 1023.970187] env[59382]: ERROR nova.compute.manager [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] [ 1023.991682] env[59382]: DEBUG oslo_concurrency.lockutils [None req-a70fd6bb-5f9f-4e21-a965-ddceee856ae1 tempest-AttachVolumeShelveTestJSON-1185374599 tempest-AttachVolumeShelveTestJSON-1185374599-project-member] Lock "acae2ecc-9a00-4356-96d7-a7521ea46f32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 393.048s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.000287] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1024.001049] env[59382]: ERROR nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] result = getattr(controller, method)(*args, **kwargs) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._get(image_id) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] resp, body = self.http_client.get(url, headers=header) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.request(url, 'GET', 
**kwargs) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._handle_response(resp) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise exc.from_response(resp, resp.content) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] During handling of the above exception, another exception occurred: [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] yield resources [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.driver.spawn(context, instance, image_meta, [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._fetch_image_if_missing(context, vi) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image_fetch(context, vi, tmp_image_ds_loc) [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] 
images.fetch_image( [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1024.001049] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] metadata = IMAGE_API.get(context, image_ref) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return session.show(context, image_id, [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] _reraise_translated_image_exception(image_id) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise new_exc.with_traceback(exc_trace) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] result = getattr(controller, method)(*args, **kwargs) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._get(image_id) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] resp, body = self.http_client.get(url, headers=header) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.request(url, 'GET', **kwargs) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.002082] 
env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._handle_response(resp) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise exc.from_response(resp, resp.content) [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1024.002082] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.002082] env[59382]: INFO nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Terminating instance [ 1024.005621] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.005621] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.005621] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7f76b828-64dd-4dbe-9f2e-0a941256830a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.010766] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1024.010976] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1024.011217] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256780, 'name': CreateVM_Task} progress is 99%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.011907] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2442a02-e803-43a6-8c87-f2ba6c5727b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.016521] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Starting instance... {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1024.030017] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1024.030764] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1024.031056] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1024.031178] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1024.031359] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1024.031505] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1024.031649] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1024.033061] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1024.033061] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1024.033061] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1024.033061] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1024.033061] env[59382]: DEBUG nova.virt.hardware [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1024.033260] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.033307] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1024.034580] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1024.035457] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1862a496-5517-435b-9e8d-3be322a9bd1b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.038430] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a3149f6d-db29-4188-a20f-c54787e0c6b1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.041436] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-24c6b644-6fae-4c76-99f4-14b939f2496b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.045774] env[59382]: DEBUG nova.compute.manager [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Received event network-vif-plugged-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1024.045977] env[59382]: DEBUG oslo_concurrency.lockutils [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] Acquiring lock "81f08c14-ee4b-4954-bf53-dc02bb600279-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1024.046192] env[59382]: DEBUG oslo_concurrency.lockutils [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] Lock "81f08c14-ee4b-4954-bf53-dc02bb600279-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1024.046383] env[59382]: DEBUG oslo_concurrency.lockutils [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] Lock "81f08c14-ee4b-4954-bf53-dc02bb600279-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.046546] env[59382]: DEBUG nova.compute.manager [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] No waiting events found dispatching network-vif-plugged-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1024.046711] env[59382]: WARNING nova.compute.manager [req-9c24df3c-95b5-4a9f-8e02-5235c0aec348 req-eb95ac15-4171-45f8-9fc9-7e6989d54b40 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Received unexpected event network-vif-plugged-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 for instance with vm_state building and task_state spawning. 
[ 1024.058717] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 1024.058717] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52708015-af82-2b08-3d83-1660e79d3cbe" [ 1024.058717] env[59382]: _type = "Task" [ 1024.058717] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.062211] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e05a6f36-26b1-4822-90a7-19b421d70ce7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.082256] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1024.082490] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1024.083932] env[59382]: INFO nova.compute.claims [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1024.086247] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1024.086484] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Creating directory with path [datastore1] vmware_temp/340bfd1f-1b44-4eae-9c4e-9e84acc0df97/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.087099] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7aec8f3c-0391-4212-a54e-279dc202c521 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.106669] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Created directory with path [datastore1] vmware_temp/340bfd1f-1b44-4eae-9c4e-9e84acc0df97/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.106882] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 
tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Fetch image to [datastore1] vmware_temp/340bfd1f-1b44-4eae-9c4e-9e84acc0df97/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1024.107221] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/340bfd1f-1b44-4eae-9c4e-9e84acc0df97/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1024.107878] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9df1f3a9-1ab7-42cf-bee5-c721c5c93922 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.115705] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5d4eda8-d498-45d4-bc19-3e732b7d05a4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.119738] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1024.120172] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1024.120265] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Deleting the datastore file [datastore1] cf672665-36c7-4251-a32a-537b9d4c38ed {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1024.121488] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6ba055f8-9c2e-4d21-94f1-78e8a7234ea0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.132837] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8710b416-7557-4f4d-8651-ebabad6e6205 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.137265] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 1024.137265] env[59382]: value = "task-2256782" [ 1024.137265] env[59382]: _type = "Task" [ 1024.137265] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.176698] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9089c7d7-332f-483d-aadb-39342d341888 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.182751] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': task-2256782, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.186257] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-625d5755-5ff0-496b-a1c2-2da1d97c613d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.212874] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1024.221889] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Successfully updated port: a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1024.235528] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.235757] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.235959] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1024.308982] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1024.312234] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Successfully created port: e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1024.314366] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96334426-3926-4142-b2f0-c1d9e8c022be {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.322268] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5100a064-27f1-4abb-b29d-ac05d1628dee {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.351768] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bc02e8f-436b-48f5-a255-48e784ac2cc0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.359641] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-685b17b8-2e1f-4fa5-ade6-15571b766b6d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.375120] env[59382]: DEBUG nova.compute.provider_tree [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1024.377082] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1024.377788] env[59382]: ERROR nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = getattr(controller, method)(*args, **kwargs) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._get(image_id) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] resp, body = self.http_client.get(url, headers=header) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.request(url, 'GET', **kwargs) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._handle_response(resp) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exc.from_response(resp, resp.content) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] yield resources [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.driver.spawn(context, instance, image_meta, [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._fetch_image_if_missing(context, vi) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] image_fetch(context, vi, tmp_image_ds_loc) [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] images.fetch_image( [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1024.377788] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] metadata = IMAGE_API.get(context, image_ref) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return session.show(context, image_id, [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] _reraise_translated_image_exception(image_id) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise new_exc.with_traceback(exc_trace) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = getattr(controller, method)(*args, **kwargs) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._get(image_id) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] resp, body = self.http_client.get(url, headers=header) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.request(url, 'GET', **kwargs) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._handle_response(resp) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exc.from_response(resp, resp.content) [ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1024.381106] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1024.381106] env[59382]: INFO nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Terminating instance [ 1024.381106] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.381106] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.381106] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.381106] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.381106] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1024.381986] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ed4ff2c7-2fb3-4db4-98ba-508c487fe464 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.384610] env[59382]: DEBUG nova.scheduler.client.report [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1024.389658] env[59382]: DEBUG nova.compute.utils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] 
Can not refresh info_cache because instance was not found {{(pid=59382) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1024.395250] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.395332] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1024.396044] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-05d2abd4-5f01-41c3-a032-fc01088a2b52 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.399191] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.317s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.399599] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1024.406374] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 1024.406374] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521a9a03-d338-5f49-e384-87f6f75d6662" [ 1024.406374] env[59382]: _type = "Task" [ 1024.406374] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.414244] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521a9a03-d338-5f49-e384-87f6f75d6662, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.418110] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1024.438075] env[59382]: DEBUG nova.compute.utils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1024.439430] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1024.439616] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1024.450437] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1024.505301] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256780, 'name': CreateVM_Task, 'duration_secs': 0.749639} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1024.505472] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1024.506109] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.506274] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.506685] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1024.506857] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-35ed58b2-7342-43d5-b9d5-4ecc2c839b08 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.511645] env[59382]: DEBUG oslo_vmware.api [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 
tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for the task: (returnval){ [ 1024.511645] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52a53b5c-9ceb-2d5b-f727-5a435d133d39" [ 1024.511645] env[59382]: _type = "Task" [ 1024.511645] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.519257] env[59382]: DEBUG oslo_vmware.api [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52a53b5c-9ceb-2d5b-f727-5a435d133d39, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.520671] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1024.544878] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1024.545223] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1024.545307] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1024.545451] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1024.545621] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
1024.545728] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1024.545935] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1024.546101] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1024.546283] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1024.546460] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1024.546639] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1024.547528] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf5427dd-78d1-4824-871e-93c5d07b3fe4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.557742] env[59382]: DEBUG nova.policy [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd6a3da0a267418dac77b2800f2ecf7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dac32643412b44dfa237f923136bb01b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1024.564172] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09877d6b-d190-41d8-8b23-aec2b3689003 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.596733] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 
tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.605614] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1024.605998] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1024.606207] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1024.607213] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce786acf-c3a2-4380-b0f2-5bdb6b37a194 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.616913] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1024.617154] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2855fafa-3a76-435c-bce3-17c323e5b2b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.645957] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1024.646190] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1024.646443] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Deleting the datastore file [datastore1] d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1024.646972] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-eb0b71b6-5e29-4aed-9513-c029b72fa338 {{(pid=59382) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.651653] env[59382]: DEBUG oslo_vmware.api [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': task-2256782, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066605} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1024.652693] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1024.652870] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1024.653054] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1024.653226] env[59382]: INFO nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1024.654787] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for the task: (returnval){ [ 1024.654787] env[59382]: value = "task-2256784" [ 1024.654787] env[59382]: _type = "Task" [ 1024.654787] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.655205] env[59382]: DEBUG nova.compute.claims [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1024.655365] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1024.655571] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1024.665161] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': task-2256784, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.683153] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.683886] env[59382]: DEBUG nova.compute.utils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance cf672665-36c7-4251-a32a-537b9d4c38ed could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1024.685344] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance disappeared during build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1024.685510] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1024.685670] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1024.685820] env[59382]: DEBUG nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1024.685977] env[59382]: DEBUG nova.network.neutron [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1024.698807] env[59382]: DEBUG nova.network.neutron [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Updating instance_info_cache with network_info: [{"id": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "address": "fa:16:3e:3a:d8:f5", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa4f52aeb-2c", "ovs_interfaceid": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.722203] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1024.722203] env[59382]: DEBUG nova.compute.manager [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Instance network_info: |[{"id": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "address": "fa:16:3e:3a:d8:f5", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa4f52aeb-2c", "ovs_interfaceid": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1024.722401] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3a:d8:f5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '47499d09-8010-4d02-ac96-4f057c104692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a4f52aeb-2cd7-4423-a960-0819eb5ab3a9', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1024.730164] env[59382]: DEBUG oslo.service.loopingcall [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1024.730894] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1024.731240] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-938c1c00-d4ad-456c-a0fc-f69ee4b6242e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.751224] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1024.751224] env[59382]: value = "task-2256785" [ 1024.751224] env[59382]: _type = "Task" [ 1024.751224] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.760165] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256785, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.831372] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Successfully updated port: 72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1024.835304] env[59382]: DEBUG neutronclient.v2_0.client [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1024.836936] env[59382]: ERROR nova.compute.manager [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] result = getattr(controller, method)(*args, **kwargs) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._get(image_id) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] resp, body = self.http_client.get(url, headers=header) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.request(url, 'GET', **kwargs) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._handle_response(resp) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise exc.from_response(resp, resp.content) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] During handling of the above exception, another exception occurred: [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.driver.spawn(context, instance, image_meta, [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._fetch_image_if_missing(context, vi) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image_fetch(context, vi, tmp_image_ds_loc) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] images.fetch_image( [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] metadata = IMAGE_API.get(context, image_ref) [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: 
cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return session.show(context, image_id, [ 1024.836936] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] _reraise_translated_image_exception(image_id) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise new_exc.with_traceback(exc_trace) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] result = getattr(controller, method)(*args, **kwargs) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._get(image_id) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] resp, body = self.http_client.get(url, headers=header) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.request(url, 'GET', **kwargs) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._handle_response(resp) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: 
cf672665-36c7-4251-a32a-537b9d4c38ed] raise exc.from_response(resp, resp.content) [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] nova.exception.ImageNotAuthorized: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] During handling of the above exception, another exception occurred: [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._build_and_run_instance(context, instance, image, [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] with excutils.save_and_reraise_exception(): [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.force_reraise() [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise self.value [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] with self.rt.instance_claim(context, instance, node, allocs, [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.abort() [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1024.837975] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return f(*args, **kwargs) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File 
"/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._unset_instance_host_and_node(instance) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] instance.save() [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] updates, result = self.indirection_api.object_action( [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return cctxt.call(context, 'object_action', objinst=objinst, [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] result = self.transport._send( [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._driver.send(target, ctxt, message, [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise result [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] nova.exception_Remote.InstanceNotFound_Remote: Instance cf672665-36c7-4251-a32a-537b9d4c38ed could not be found. 
[ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return getattr(target, method)(*args, **kwargs) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return fn(self, *args, **kwargs) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] old_ref, inst_ref = db.instance_update_and_get_original( [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return f(*args, **kwargs) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] with excutils.save_and_reraise_exception() as ectxt: [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.force_reraise() [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise self.value [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return f(*args, 
**kwargs) [ 1024.838999] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return f(context, *args, **kwargs) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise exception.InstanceNotFound(instance_id=uuid) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] nova.exception.InstanceNotFound: Instance cf672665-36c7-4251-a32a-537b9d4c38ed could not be found. [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] During handling of the above exception, another exception occurred: [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] exception_handler_v20(status_code, error_body) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise client_exc(message=error_message, [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1024.840168] 
env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Neutron server returns request_ids: ['req-51605c39-67ac-44ef-97bf-62cb0d3447b2'] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] During handling of the above exception, another exception occurred: [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Traceback (most recent call last): [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._deallocate_network(context, instance, requested_networks) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self.network_api.deallocate_for_instance( [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] data = neutron.list_ports(**search_opts) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.list('ports', self.ports_path, retrieve_all, [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1024.840168] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] for r in self._pagination(collection, path, **params): [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] res = self.get(path, params=params) [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.retry_request("GET", action, body=body, [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] return self.do_request(method, action, body=body, [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] ret = obj(*args, **kwargs) [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] self._handle_fault_response(status_code, replybody, resp) [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] raise exception.Unauthorized() [ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] nova.exception.Unauthorized: Not authorized. 
[ 1024.841122] env[59382]: ERROR nova.compute.manager [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] [ 1024.841122] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.841122] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.841122] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1024.873318] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60f10f42-5dca-4845-9591-547f878def41 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "cf672665-36c7-4251-a32a-537b9d4c38ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 393.919s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.883418] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1024.886970] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1024.919188] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1024.919601] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating directory with path [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.919957] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b67a1aa6-ace9-4350-80eb-3bc796ca1e6d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.939189] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Created directory with path [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.939572] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Fetch image to [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1024.939888] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1024.945029] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a47ea72-02a4-456c-8721-d21d84c546a5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.951544] env[59382]: DEBUG nova.compute.manager [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Received event network-changed-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1024.951793] env[59382]: DEBUG nova.compute.manager [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Refreshing instance network info cache due to event network-changed-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1024.952077] env[59382]: DEBUG oslo_concurrency.lockutils [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] Acquiring lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.952226] env[59382]: DEBUG oslo_concurrency.lockutils [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] Acquired lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.952406] env[59382]: DEBUG nova.network.neutron [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Refreshing network info cache for port cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1024.958337] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c35ed562-7ce0-418a-a36d-c0c1c94c36ee {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.965880] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1024.966138] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1024.967662] env[59382]: INFO nova.compute.claims [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1024.980875] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aab4662c-9b8f-467f-a9f4-4234bc27a1b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.986309] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Successfully created port: 4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1025.034047] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29b1ba36-5294-4a1b-94a1-fe08146056c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.052508] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 
tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.052815] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1025.053256] env[59382]: DEBUG oslo_concurrency.lockutils [None req-8aeb909d-aad8-4a6c-a52e-229d7c2ed266 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.053332] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aae01e66-db1f-4c2b-9e85-c96f332fdf1e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.074540] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1025.139214] env[59382]: DEBUG oslo_vmware.rw_handles [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1025.203403] env[59382]: DEBUG oslo_vmware.rw_handles [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1025.203592] env[59382]: DEBUG oslo_vmware.rw_handles [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1025.207900] env[59382]: DEBUG oslo_vmware.api [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Task: {'id': task-2256784, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.039856} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1025.208162] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1025.208680] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1025.208680] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1025.208680] env[59382]: INFO nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1025.208955] env[59382]: DEBUG oslo.service.loopingcall [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1025.209118] env[59382]: DEBUG nova.compute.manager [-] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1025.209200] env[59382]: DEBUG nova.network.neutron [-] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1025.260874] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256785, 'name': CreateVM_Task, 'duration_secs': 0.288864} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1025.261112] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1025.261782] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.261941] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.262264] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1025.262507] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5a5023c6-f716-4562-8a65-8e484ffee98f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.268343] env[59382]: DEBUG oslo_vmware.api [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for the task: (returnval){ [ 1025.268343] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e9ff80-8fb6-3cdb-db13-1a99f9af557e" [ 1025.268343] env[59382]: _type = "Task" [ 1025.268343] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.275937] env[59382]: DEBUG oslo_vmware.api [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e9ff80-8fb6-3cdb-db13-1a99f9af557e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.277505] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3f464b5-5e47-4b50-8f85-b036b10bcf8c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.285174] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5c37a2a-7eaf-4428-97cf-71ba8fc17f81 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.315075] env[59382]: DEBUG nova.network.neutron [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Updating instance_info_cache with network_info: [{"id": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "address": "fa:16:3e:7a:91:20", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72e7e0b8-46", "ovs_interfaceid": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.319431] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-326a7dfd-9703-43e2-bdbd-763dd0aa0c9d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.328236] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9488eb6-425f-47c1-8bc8-0ac94b479dd7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.333064] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.333343] env[59382]: DEBUG nova.compute.manager [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Instance network_info: |[{"id": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "address": "fa:16:3e:7a:91:20", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": 
"tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72e7e0b8-46", "ovs_interfaceid": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1025.334042] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7a:91:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '47499d09-8010-4d02-ac96-4f057c104692', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '72e7e0b8-4690-4d93-9e83-3da57109f5d6', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1025.341609] env[59382]: DEBUG oslo.service.loopingcall [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1025.350080] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1025.350559] env[59382]: DEBUG nova.compute.provider_tree [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1025.351641] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b663a3b-b642-4157-90dc-c808b66654c0 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.368344] env[59382]: DEBUG nova.scheduler.client.report [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1025.378105] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1025.378105] env[59382]: value = "task-2256786" [ 1025.378105] env[59382]: _type = "Task" [ 1025.378105] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.382444] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.416s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.382954] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Start building networks asynchronously for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1025.390338] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256786, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.392993] env[59382]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1025.393235] env[59382]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-cee97b82-1e32-4481-87a9-4050fc6803e9'] [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1025.393931] 
env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1025.393931] env[59382]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1025.393931] env[59382]: ERROR oslo.service.loopingcall [ 1025.395217] env[59382]: ERROR nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1025.414676] env[59382]: DEBUG nova.compute.utils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1025.416053] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1025.416249] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1025.419763] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance has been destroyed from under us while trying to set it to ERROR {{(pid=59382) _set_instance_obj_error_state /opt/stack/nova/nova/compute/manager.py:728}} [ 1025.420108] env[59382]: WARNING nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Could not clean up failed build, not rescheduling. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1025.420371] env[59382]: DEBUG nova.compute.claims [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1025.420562] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1025.420816] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1025.427382] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1025.451684] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.031s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.452879] env[59382]: DEBUG nova.compute.utils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1025.454575] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance disappeared during build. 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1025.454666] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1025.454908] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquiring lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.455070] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Acquired lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.455227] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1025.463449] env[59382]: DEBUG nova.compute.utils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Can not refresh info_cache because instance was not found {{(pid=59382) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1025.469790] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Successfully updated port: e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1025.479521] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Acquiring lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.479663] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Acquired lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.479809] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1025.493878] env[59382]: DEBUG 
nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1025.496765] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1025.524762] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:42:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1025.525039] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1025.525219] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1025.525411] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1025.525574] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1025.525720] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1025.526029] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 
tempest-MultipleCreateTestJSON-519144173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1025.526246] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1025.526463] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1025.526784] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1025.526848] env[59382]: DEBUG nova.virt.hardware [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1025.527775] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10c659fc-5362-4051-980b-47ea96b64d52 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.535661] env[59382]: DEBUG nova.policy [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fd6a3da0a267418dac77b2800f2ecf7e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dac32643412b44dfa237f923136bb01b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1025.541564] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e629cb38-65cf-4748-a89b-20dd3440dd86 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.556121] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1025.604155] env[59382]: DEBUG nova.network.neutron [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Updated VIF entry in instance network info cache for port cf7c2d90-e7e6-4c8e-a1ea-abbebb909958. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1025.604759] env[59382]: DEBUG nova.network.neutron [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Updating instance_info_cache with network_info: [{"id": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "address": "fa:16:3e:a3:82:8b", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcf7c2d90-e7", "ovs_interfaceid": "cf7c2d90-e7e6-4c8e-a1ea-abbebb909958", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.615502] env[59382]: DEBUG oslo_concurrency.lockutils [req-13c430a1-8c46-46c3-a10e-23ac7dc25aa7 req-6cae168e-5a87-4bda-8bac-88a0848bca9e service nova] Releasing lock "refresh_cache-2dfb7e00-fea7-4186-914a-98e1e5fbe49a" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.774024] env[59382]: DEBUG nova.network.neutron [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updating instance_info_cache with network_info: [{"id": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "address": "fa:16:3e:99:4d:3f", "network": {"id": "b08f6f0d-7e63-4899-9ba2-81aad8fd6545", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1135306959-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fe7ee8ba08b5483dba2ba9f7f0851021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ebd8af-aaf6-4d04-b869-3882e2571ed7", "external-id": "nsx-vlan-transportzone-541", "segmentation_id": 541, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tape4f99fe2-3a", "ovs_interfaceid": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.781209] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.781512] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1025.781759] env[59382]: DEBUG oslo_concurrency.lockutils [None req-33ad1129-8ff2-44ab-9e88-d2f9f9857501 tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.789689] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Releasing lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.789961] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Instance network_info: |[{"id": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "address": "fa:16:3e:99:4d:3f", "network": {"id": "b08f6f0d-7e63-4899-9ba2-81aad8fd6545", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1135306959-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fe7ee8ba08b5483dba2ba9f7f0851021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ebd8af-aaf6-4d04-b869-3882e2571ed7", "external-id": "nsx-vlan-transportzone-541", "segmentation_id": 541, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4f99fe2-3a", "ovs_interfaceid": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1025.790319] 
env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:4d:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '04ebd8af-aaf6-4d04-b869-3882e2571ed7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1025.797703] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Creating folder: Project (fe7ee8ba08b5483dba2ba9f7f0851021). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1025.798232] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dcf0ccbb-7206-4cf6-bce2-176481d6516d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.811552] env[59382]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1025.811723] env[59382]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59382) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1025.812030] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Folder already exists: Project (fe7ee8ba08b5483dba2ba9f7f0851021). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1025.812228] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Creating folder: Instances. Parent ref: group-v459792. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1025.812457] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c83f2ba5-a4ad-4de9-a92d-fc113b232547 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.821673] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Created folder: Instances in parent group-v459792. [ 1025.821892] env[59382]: DEBUG oslo.service.loopingcall [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1025.822093] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1025.822274] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b43e2404-8b94-4d49-8c59-f67a9403c01c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.841215] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1025.841215] env[59382]: value = "task-2256789" [ 1025.841215] env[59382]: _type = "Task" [ 1025.841215] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.850469] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256789, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.865220] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.875500] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Releasing lock "refresh_cache-d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.875785] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1025.875922] env[59382]: DEBUG nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1025.876093] env[59382]: DEBUG nova.network.neutron [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1025.888204] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256786, 'name': CreateVM_Task, 'duration_secs': 0.335851} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1025.888385] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1025.889162] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.889486] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.889776] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1025.890048] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2aa0df7c-893c-4488-ab7d-509a7b1d5365 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.894672] env[59382]: DEBUG oslo_vmware.api [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Waiting for the task: (returnval){ [ 1025.894672] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521fd2b9-ac36-648f-bb6e-b2188bad3e68" [ 1025.894672] env[59382]: _type = "Task" [ 1025.894672] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.903700] env[59382]: DEBUG oslo_vmware.api [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]521fd2b9-ac36-648f-bb6e-b2188bad3e68, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.972275] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Successfully updated port: 4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1025.983364] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.983571] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.983759] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1026.050069] env[59382]: DEBUG neutronclient.v2_0.client [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59382) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1026.052179] env[59382]: ERROR nova.compute.manager [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] exception_handler_v20(status_code, error_body) [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise client_exc(message=error_message, [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Neutron server returns request_ids: ['req-cee97b82-1e32-4481-87a9-4050fc6803e9'] [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2881, in _build_resources [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._shutdown_instance(context, instance, [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 3140, in _shutdown_instance [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._try_deallocate_network(context, instance, requested_networks) [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 3054, in _try_deallocate_network [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] with excutils.save_and_reraise_exception(): [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.force_reraise() [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: 
d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise self.value [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 3052, in _try_deallocate_network [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] _deallocate_network_with_retries() [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 436, in func [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return evt.wait() [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = hub.switch() [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.greenlet.switch() [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = func(*self.args, **self.kw) [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py", line 407, in _func [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = f(*args, **kwargs) [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 3041, in _deallocate_network_with_retries [ 1026.052179] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._deallocate_network( [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.network_api.deallocate_for_instance( [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] data = neutron.list_ports(**search_opts) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.053104] env[59382]: ERROR nova.compute.manager 
[instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.list('ports', self.ports_path, retrieve_all, [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] for r in self._pagination(collection, path, **params): [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] res = self.get(path, params=params) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.retry_request("GET", action, body=body, [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.do_request(method, action, body=body, [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._handle_fault_response(status_code, replybody, resp) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: 
d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2594, in _build_and_run_instance [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] with self._build_resources(context, instance, [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__ [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.gen.throw(typ, value, traceback) [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2889, in _build_resources [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exception.BuildAbortException( [ 1026.053104] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception.BuildAbortException: Build of instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 aborted: Not authorized for image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71. 
[ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._build_and_run_instance(context, instance, image, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] with excutils.save_and_reraise_exception(): [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.force_reraise() [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise self.value [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] with self.rt.instance_claim(context, instance, node, allocs, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.abort() [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return f(*args, **kwargs) [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._unset_instance_host_and_node(instance) [ 1026.054007] env[59382]: ERROR nova.compute.manager 
[instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] instance.save() [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] updates, result = self.indirection_api.object_action( [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return cctxt.call(context, 'object_action', objinst=objinst, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] result = self.transport._send( [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._driver.send(target, ctxt, message, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise result [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception_Remote.InstanceNotFound_Remote: Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 could not be found. 
[ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054007] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return getattr(target, method)(*args, **kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return fn(self, *args, **kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] old_ref, inst_ref = db.instance_update_and_get_original( [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return f(*args, **kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] with excutils.save_and_reraise_exception() as ectxt: [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.force_reraise() [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise self.value [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return f(*args, 
**kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return f(context, *args, **kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exception.InstanceNotFound(instance_id=uuid) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception.InstanceNotFound: Instance d3d59ff4-eaa9-46b3-8279-50e5cfe740a0 could not be found. [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] exception_handler_v20(status_code, error_body) [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise client_exc(message=error_message, [ 1026.054913] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1026.055929] 
env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Neutron server returns request_ids: ['req-fc1d2fda-775a-4275-9d0b-2a33541758f5'] [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] During handling of the above exception, another exception occurred: [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Traceback (most recent call last): [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._deallocate_network(context, instance, requested_networks) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self.network_api.deallocate_for_instance( [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] data = neutron.list_ports(**search_opts) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.list('ports', self.ports_path, retrieve_all, [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] for r in self._pagination(collection, path, **params): [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] res = self.get(path, params=params) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.retry_request("GET", action, body=body, [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] return self.do_request(method, action, body=body, [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] ret = obj(*args, **kwargs) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] self._handle_fault_response(status_code, replybody, resp) [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] raise exception.Unauthorized() [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] nova.exception.Unauthorized: Not authorized. [ 1026.055929] env[59382]: ERROR nova.compute.manager [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] [ 1026.059070] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1026.072918] env[59382]: DEBUG nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Received event network-changed-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1026.073131] env[59382]: DEBUG nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Refreshing instance network info cache due to event network-changed-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1026.073339] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Acquiring lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.073478] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Acquired lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.073634] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Refreshing network info cache for port a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1026.085966] env[59382]: DEBUG oslo_concurrency.lockutils [None req-0f8256bd-eab8-412c-b151-0b446a4b72a3 tempest-ListImageFiltersTestJSON-1735286212 tempest-ListImageFiltersTestJSON-1735286212-project-member] Lock "d3d59ff4-eaa9-46b3-8279-50e5cfe740a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 393.482s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.096110] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Starting instance... 
{{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1026.158545] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1026.158821] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1026.160301] env[59382]: INFO nova.compute.claims [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1026.251011] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Updating instance_info_cache with network_info: [{"id": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "address": "fa:16:3e:69:21:59", "network": {"id": "a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b811be5-37", "ovs_interfaceid": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1026.264178] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Releasing lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.264482] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Instance network_info: |[{"id": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "address": "fa:16:3e:69:21:59", "network": {"id": 
"a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b811be5-37", "ovs_interfaceid": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1026.264836] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:21:59', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '21310d90-efbc-45a8-a97f-c4358606530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4b811be5-378f-4876-aa4d-571a92ce5cf5', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1026.272406] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Creating folder: Project (dac32643412b44dfa237f923136bb01b). Parent ref: group-v459741. {{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1026.272899] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-845e82db-b49c-4eb4-8824-6be66271180c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.284079] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Created folder: Project (dac32643412b44dfa237f923136bb01b) in parent group-v459741. [ 1026.284302] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Creating folder: Instances. Parent ref: group-v459810. 
{{(pid=59382) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1026.284627] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c5822654-f2cb-417d-bcef-2ef31d056034 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.294165] env[59382]: INFO nova.virt.vmwareapi.vm_util [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Created folder: Instances in parent group-v459810. [ 1026.294311] env[59382]: DEBUG oslo.service.loopingcall [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1026.294469] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1026.294675] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c8445bd9-b4f4-4863-b5cf-1a04f12e8945 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.317545] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1026.317545] env[59382]: value = "task-2256792" [ 1026.317545] env[59382]: _type = "Task" [ 1026.317545] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.325630] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256792, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.354304] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256789, 'name': CreateVM_Task, 'duration_secs': 0.302526} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1026.354512] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1026.355191] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/sda', 'device_type': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-459795', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'name': 'volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '55c244ec-daa2-4eef-8de3-324d0815026b', 'attached_at': '', 'detached_at': '', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'serial': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128'}, 'boot_index': 0, 'delete_on_termination': True, 'disk_bus': None, 'attachment_id': '126d0df8-8fff-41fa-b8cf-8c49efac2e46', 'guest_format': None, 'volume_type': None}], 'swap': None} {{(pid=59382) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1026.355428] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Root volume attach. 
Driver type: vmdk {{(pid=59382) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1026.356215] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4611387f-9c18-41bd-a1f9-22355bb8abeb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.364577] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91cfad4b-9fa3-43c1-81b9-6d69647c0cde {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.376269] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f18d0ec2-2084-41ff-bfbb-1ae471308ed7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.384086] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Successfully created port: 77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1026.391970] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-ca98b20c-fc2c-4e4d-af6a-12c809c7331d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.407060] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.407322] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1026.407539] env[59382]: DEBUG oslo_concurrency.lockutils [None req-58d70e1c-f80b-42ce-bd95-84c6d1c9e99b tempest-ListServerFiltersTestJSON-1615496023 tempest-ListServerFiltersTestJSON-1615496023-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.407794] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1026.407794] env[59382]: value = "task-2256793" [ 1026.407794] env[59382]: _type = "Task" [ 1026.407794] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.412028] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6b5efb8-ab47-468a-8fb5-b378a71a1e53 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.419457] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.421927] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6797c8-6fef-46b0-b7d6-f4651ae8c562 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.452723] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cc1dc98-3513-4a7f-81e5-b8ea483beb43 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.460037] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88d7bb04-3a75-47c3-889a-e2f233f177f1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.473408] env[59382]: DEBUG nova.compute.provider_tree [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1026.484598] env[59382]: DEBUG nova.scheduler.client.report [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1026.501408] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.501921] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Start building networks asynchronously for instance. 
{{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1026.527103] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1026.538028] env[59382]: DEBUG nova.compute.utils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Using /dev/sd instead of None {{(pid=59382) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1026.539830] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Allocating IP information in the background. {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1026.539931] env[59382]: DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] allocate_for_instance() {{(pid=59382) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1026.549624] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Start building block device mappings for instance. {{(pid=59382) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1026.617469] env[59382]: DEBUG nova.policy [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8949b5c249e47bbba781a4aa0d0f065', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e07673924d647a9aa97917daba6b838', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59382) authorize /opt/stack/nova/nova/policy.py:203}} [ 1026.627923] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Start spawning the instance on the hypervisor. {{(pid=59382) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1026.634038] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Updated VIF entry in instance network info cache for port a4f52aeb-2cd7-4423-a960-0819eb5ab3a9. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1026.634365] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Updating instance_info_cache with network_info: [{"id": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "address": "fa:16:3e:3a:d8:f5", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa4f52aeb-2c", "ovs_interfaceid": "a4f52aeb-2cd7-4423-a960-0819eb5ab3a9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1026.642813] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Releasing lock "refresh_cache-81f08c14-ee4b-4954-bf53-dc02bb600279" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.643251] env[59382]: DEBUG nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Received event network-vif-plugged-72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1026.643251] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Acquiring lock "05e46e58-1de8-48a0-a139-c202d77e85ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1026.643435] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Lock "05e46e58-1de8-48a0-a139-c202d77e85ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1026.643592] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Lock "05e46e58-1de8-48a0-a139-c202d77e85ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.644177] env[59382]: DEBUG nova.compute.manager 
[req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] No waiting events found dispatching network-vif-plugged-72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1026.644177] env[59382]: WARNING nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Received unexpected event network-vif-plugged-72e7e0b8-4690-4d93-9e83-3da57109f5d6 for instance with vm_state building and task_state spawning. [ 1026.644177] env[59382]: DEBUG nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Received event network-changed-72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1026.644348] env[59382]: DEBUG nova.compute.manager [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Refreshing instance network info cache due to event network-changed-72e7e0b8-4690-4d93-9e83-3da57109f5d6. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1026.644409] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Acquiring lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.644545] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Acquired lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.644700] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Refreshing network info cache for port 72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1026.657343] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-02-04T04:50:06Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='bbedebc1-43d2-4a8a-9925-29ea8e6fe7f7',id=38,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-620729087',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-02-04T04:42:22Z,direct_url=,disk_format='vmdk',id=4092f5d9-e52b-450e-8bc7-85f1a22d3b71,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='07053f9d79ae41c3ae8ab170afb47449',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-02-04T04:42:23Z,virtual_size=,visibility=), allow threads: False {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 
1026.657640] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Flavor limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1026.657741] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Image limits 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1026.658172] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Flavor pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1026.658172] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Image pref 0:0:0 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1026.658268] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59382) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1026.658420] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1026.658577] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1026.658740] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Got 1 possible topologies {{(pid=59382) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1026.658899] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1026.659131] env[59382]: DEBUG nova.virt.hardware [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59382) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1026.660225] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e76f8e39-1534-4d02-9e9a-95904e223d33 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1026.671978] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73bd2538-1137-45c8-a922-bddaa2f43618 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.829504] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256792, 'name': CreateVM_Task, 'duration_secs': 0.293849} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1026.829699] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1026.830373] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.830542] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.830874] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1026.831146] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d082162a-807e-4c2f-aa40-fa81d421692e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.838377] env[59382]: DEBUG oslo_vmware.api [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Waiting for the task: (returnval){ [ 1026.838377] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5209dcbd-4b05-3142-84d7-9a947d4dd15a" [ 1026.838377] env[59382]: _type = "Task" [ 1026.838377] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.848388] env[59382]: DEBUG oslo_vmware.api [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]5209dcbd-4b05-3142-84d7-9a947d4dd15a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.918150] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 40%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.947905] env[59382]: DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Successfully created port: 14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1026.997023] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Received event network-vif-plugged-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1026.997644] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquiring lock "55c244ec-daa2-4eef-8de3-324d0815026b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.001306] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Lock "55c244ec-daa2-4eef-8de3-324d0815026b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.001616] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Lock "55c244ec-daa2-4eef-8de3-324d0815026b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.004s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.001907] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] No waiting events found dispatching network-vif-plugged-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1027.001960] env[59382]: WARNING nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Received unexpected event network-vif-plugged-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 for instance with vm_state building and task_state spawning. [ 1027.002101] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Received event network-changed-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1027.002253] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Refreshing instance network info cache due to event network-changed-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1027.002473] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquiring lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.002565] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquired lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.002717] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Refreshing network info cache for port e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1027.219733] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Updated VIF entry in instance network info cache for port 72e7e0b8-4690-4d93-9e83-3da57109f5d6. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1027.220120] env[59382]: DEBUG nova.network.neutron [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Updating instance_info_cache with network_info: [{"id": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "address": "fa:16:3e:7a:91:20", "network": {"id": "c8d90262-c881-4516-a6e9-4cd30716b02b", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1874138198-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7ec3ea647eaf467cbd033067f6e4dbfa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "47499d09-8010-4d02-ac96-4f057c104692", "external-id": "nsx-vlan-transportzone-14", "segmentation_id": 14, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72e7e0b8-46", "ovs_interfaceid": "72e7e0b8-4690-4d93-9e83-3da57109f5d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.233650] env[59382]: DEBUG oslo_concurrency.lockutils [req-1fec685b-09a7-4dab-97f4-d47541608284 req-bc91a62f-0276-4d72-9d62-524c9f6b41a9 service nova] Releasing lock "refresh_cache-05e46e58-1de8-48a0-a139-c202d77e85ad" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.351376] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.355081] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1027.355081] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.422416] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 54%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.458319] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updated VIF entry in instance network info cache for port e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1027.458717] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updating instance_info_cache with network_info: [{"id": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "address": "fa:16:3e:99:4d:3f", "network": {"id": "b08f6f0d-7e63-4899-9ba2-81aad8fd6545", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1135306959-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fe7ee8ba08b5483dba2ba9f7f0851021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ebd8af-aaf6-4d04-b869-3882e2571ed7", "external-id": "nsx-vlan-transportzone-541", "segmentation_id": 541, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4f99fe2-3a", "ovs_interfaceid": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.469998] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Releasing lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.470358] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Received event network-vif-plugged-4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1027.470557] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquiring lock "e57c71dc-fdb2-4861-b716-c6caebd6c29e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.470765] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Lock "e57c71dc-fdb2-4861-b716-c6caebd6c29e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.470916] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Lock "e57c71dc-fdb2-4861-b716-c6caebd6c29e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.471179] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] No waiting events found dispatching network-vif-plugged-4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1027.471272] env[59382]: WARNING nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Received unexpected event network-vif-plugged-4b811be5-378f-4876-aa4d-571a92ce5cf5 for instance with vm_state building and task_state spawning. [ 1027.471447] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Received event network-changed-4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1027.471580] env[59382]: DEBUG nova.compute.manager [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Refreshing instance network info cache due to event network-changed-4b811be5-378f-4876-aa4d-571a92ce5cf5. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1027.471841] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquiring lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.471983] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Acquired lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.473049] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Refreshing network info cache for port 4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1027.527424] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1027.572940] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Successfully updated port: 77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1027.581980] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.582214] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.582315] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1027.635252] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1027.848616] env[59382]: DEBUG nova.network.neutron [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Updating instance_info_cache with network_info: [{"id": "77c63660-6775-4d77-9825-698c6ef5ea2e", "address": "fa:16:3e:03:de:dd", "network": {"id": "a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77c63660-67", "ovs_interfaceid": "77c63660-6775-4d77-9825-698c6ef5ea2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.861529] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Releasing lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.861854] env[59382]: DEBUG nova.compute.manager [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Instance network_info: |[{"id": "77c63660-6775-4d77-9825-698c6ef5ea2e", "address": "fa:16:3e:03:de:dd", "network": {"id": "a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77c63660-67", "ovs_interfaceid": "77c63660-6775-4d77-9825-698c6ef5ea2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1027.862459] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:03:de:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '21310d90-efbc-45a8-a97f-c4358606530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '77c63660-6775-4d77-9825-698c6ef5ea2e', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1027.870580] env[59382]: DEBUG oslo.service.loopingcall [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1027.871169] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1027.871396] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-41cd4df1-05ee-43a3-8939-7e07e19f4526 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.896219] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1027.896219] env[59382]: value = "task-2256794" [ 1027.896219] env[59382]: _type = "Task" [ 1027.896219] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.907760] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256794, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.921070] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 67%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.047532] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Updated VIF entry in instance network info cache for port 4b811be5-378f-4876-aa4d-571a92ce5cf5. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1028.048054] env[59382]: DEBUG nova.network.neutron [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Updating instance_info_cache with network_info: [{"id": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "address": "fa:16:3e:69:21:59", "network": {"id": "a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4b811be5-37", "ovs_interfaceid": "4b811be5-378f-4876-aa4d-571a92ce5cf5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.059660] env[59382]: DEBUG oslo_concurrency.lockutils [req-70628521-f5d1-4814-9e99-ed1677231b1f req-b1a91612-1f53-4014-8bea-c3cb1d3d7142 service nova] Releasing lock "refresh_cache-e57c71dc-fdb2-4861-b716-c6caebd6c29e" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.095042] env[59382]: DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Successfully updated port: 14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1028.098379] env[59382]: DEBUG nova.compute.manager [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Received event network-vif-plugged-77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1028.098530] env[59382]: DEBUG oslo_concurrency.lockutils [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Acquiring lock "5fd51316-ab8f-4501-8389-de12a294f8da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1028.098717] env[59382]: DEBUG oslo_concurrency.lockutils [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Lock "5fd51316-ab8f-4501-8389-de12a294f8da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1028.098933] env[59382]: DEBUG oslo_concurrency.lockutils 
[req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Lock "5fd51316-ab8f-4501-8389-de12a294f8da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.099080] env[59382]: DEBUG nova.compute.manager [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] No waiting events found dispatching network-vif-plugged-77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1028.099316] env[59382]: WARNING nova.compute.manager [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Received unexpected event network-vif-plugged-77c63660-6775-4d77-9825-698c6ef5ea2e for instance with vm_state building and task_state spawning. [ 1028.099462] env[59382]: DEBUG nova.compute.manager [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Received event network-changed-77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1028.099553] env[59382]: DEBUG nova.compute.manager [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Refreshing instance network info cache due to event network-changed-77c63660-6775-4d77-9825-698c6ef5ea2e. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1028.099717] env[59382]: DEBUG oslo_concurrency.lockutils [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Acquiring lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.099858] env[59382]: DEBUG oslo_concurrency.lockutils [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Acquired lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.100026] env[59382]: DEBUG nova.network.neutron [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Refreshing network info cache for port 77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1028.115149] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.115317] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.115470] env[59382]: 
DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1028.177129] env[59382]: DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1028.411497] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256794, 'name': CreateVM_Task, 'duration_secs': 0.388441} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1028.414788] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1028.415550] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.415701] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.416025] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1028.416790] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2dce324b-b98d-4ed3-934f-0fd00093469b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.425168] env[59382]: DEBUG oslo_vmware.api [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Waiting for the task: (returnval){ [ 1028.425168] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52d62abb-fee5-bfc2-8c5e-c9d4d01aa9c2" [ 1028.425168] env[59382]: _type = "Task" [ 1028.425168] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.425168] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 82%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.435581] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.435797] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1028.436022] env[59382]: DEBUG oslo_concurrency.lockutils [None req-e34227b9-c5cc-4c0a-b119-4d35c85f00f6 tempest-MultipleCreateTestJSON-519144173 tempest-MultipleCreateTestJSON-519144173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.489456] env[59382]: DEBUG nova.network.neutron [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Updating instance_info_cache with network_info: [{"id": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "address": "fa:16:3e:75:b7:b5", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14781ef3-1c", "ovs_interfaceid": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.501564] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Releasing lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.501904] env[59382]: DEBUG nova.compute.manager [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Instance network_info: |[{"id": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "address": "fa:16:3e:75:b7:b5", 
"network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14781ef3-1c", "ovs_interfaceid": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59382) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1028.502319] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:75:b7:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '97b68ed7-8461-4345-b064-96a1dde53a86', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '14781ef3-1c0a-4b58-9415-43e1d4a6766f', 'vif_model': 'vmxnet3'}] {{(pid=59382) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1028.514585] env[59382]: DEBUG oslo.service.loopingcall [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1028.517056] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Creating VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1028.517056] env[59382]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-970be7e5-3faa-45ea-826d-c4427da36afa {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.532483] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1028.532483] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 1028.532483] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 1028.541403] env[59382]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1028.541403] env[59382]: value = "task-2256795" [ 1028.541403] env[59382]: _type = "Task" [ 1028.541403] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.551539] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256795, 'name': CreateVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.553220] env[59382]: DEBUG nova.network.neutron [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Updated VIF entry in instance network info cache for port 77c63660-6775-4d77-9825-698c6ef5ea2e. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1028.553569] env[59382]: DEBUG nova.network.neutron [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Updating instance_info_cache with network_info: [{"id": "77c63660-6775-4d77-9825-698c6ef5ea2e", "address": "fa:16:3e:03:de:dd", "network": {"id": "a2a2ab84-0df2-4a95-9509-85c2de2506f2", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1333789506-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dac32643412b44dfa237f923136bb01b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap77c63660-67", "ovs_interfaceid": "77c63660-6775-4d77-9825-698c6ef5ea2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.556504] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.556902] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.556902] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.556902] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557090] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557168] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Skipping network cache update for instance because it is Building. 
{{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557294] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557488] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557672] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1028.557801] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 1028.563890] env[59382]: DEBUG oslo_concurrency.lockutils [req-f6f8b6f0-0d97-4cfc-8c03-5253d4322aa3 req-300b1c17-1757-4f81-a2e2-e0b13bf613bf service nova] Releasing lock "refresh_cache-5fd51316-ab8f-4501-8389-de12a294f8da" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.923176] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 97%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.026800] env[59382]: DEBUG nova.compute.manager [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Received event network-vif-plugged-14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1029.027130] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Acquiring lock "8ea743ab-33df-4834-b1e7-2ef7f1e1a147-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.027454] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Lock "8ea743ab-33df-4834-b1e7-2ef7f1e1a147-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.027717] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Lock "8ea743ab-33df-4834-b1e7-2ef7f1e1a147-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.027957] env[59382]: DEBUG nova.compute.manager [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] No waiting events found dispatching network-vif-plugged-14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1029.028214] env[59382]: WARNING nova.compute.manager [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Received unexpected event network-vif-plugged-14781ef3-1c0a-4b58-9415-43e1d4a6766f for instance with vm_state building and task_state spawning. [ 1029.028436] env[59382]: DEBUG nova.compute.manager [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Received event network-changed-14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1029.028654] env[59382]: DEBUG nova.compute.manager [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Refreshing instance network info cache due to event network-changed-14781ef3-1c0a-4b58-9415-43e1d4a6766f. 
{{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1029.028916] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Acquiring lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.029068] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Acquired lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.029279] env[59382]: DEBUG nova.network.neutron [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Refreshing network info cache for port 14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1029.051539] env[59382]: DEBUG oslo_vmware.api [-] Task: {'id': task-2256795, 'name': CreateVM_Task, 'duration_secs': 0.359069} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1029.051718] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Created VM on the ESX host {{(pid=59382) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1029.052330] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.052505] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.052829] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1029.053102] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d600394-46ef-40e8-8caf-504422b6fb5e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.057582] env[59382]: DEBUG oslo_vmware.api [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Waiting for the task: (returnval){ [ 1029.057582] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52bf6e88-c398-15c6-aa01-5f876f38115e" [ 1029.057582] env[59382]: _type = "Task" [ 1029.057582] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.065841] env[59382]: DEBUG oslo_vmware.api [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52bf6e88-c398-15c6-aa01-5f876f38115e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.268065] env[59382]: DEBUG nova.network.neutron [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Updated VIF entry in instance network info cache for port 14781ef3-1c0a-4b58-9415-43e1d4a6766f. {{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1029.268555] env[59382]: DEBUG nova.network.neutron [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Updating instance_info_cache with network_info: [{"id": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "address": "fa:16:3e:75:b7:b5", "network": {"id": "78382618-aef6-46b4-a38a-3d09992066c0", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "07053f9d79ae41c3ae8ab170afb47449", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "97b68ed7-8461-4345-b064-96a1dde53a86", "external-id": "nsx-vlan-transportzone-140", "segmentation_id": 140, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap14781ef3-1c", "ovs_interfaceid": "14781ef3-1c0a-4b58-9415-43e1d4a6766f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.278244] env[59382]: DEBUG oslo_concurrency.lockutils [req-e5ba46e6-5761-4c8e-8adb-7f9408d46e63 req-24545c50-f1ce-46f8-8890-1855a978f222 service nova] Releasing lock "refresh_cache-8ea743ab-33df-4834-b1e7-2ef7f1e1a147" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.421840] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task} progress is 98%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.526681] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1029.526985] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1029.527163] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 1029.527324] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1029.537408] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.537648] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.537814] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.537970] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1029.539026] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b2e43eb-1adf-48ee-974b-bca812132be5 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.547635] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0afb42bf-537e-408a-84ad-3343ae21dd88 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.564038] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d742ff1b-6512-487b-9d47-54a8e048b237 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.573034] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.573272] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Processing image 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1029.573479] env[59382]: DEBUG oslo_concurrency.lockutils [None req-02176801-cfda-4bfe-ae1c-2a24c9fdf344 tempest-MigrationsAdminTest-392436684 tempest-MigrationsAdminTest-392436684-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.574417] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a66fda31-e75f-4c58-9bb7-f021764e6fa9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.603837] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181241MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1029.603997] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.604207] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.667821] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.667980] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 390366c5-ced3-4ac9-9687-c5d2895fbc1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668127] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 2dfb7e00-fea7-4186-914a-98e1e5fbe49a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668255] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 81f08c14-ee4b-4954-bf53-dc02bb600279 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668406] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 05e46e58-1de8-48a0-a139-c202d77e85ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668553] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 55c244ec-daa2-4eef-8de3-324d0815026b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668675] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance e57c71dc-fdb2-4861-b716-c6caebd6c29e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668793] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 5fd51316-ab8f-4501-8389-de12a294f8da actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.668908] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 8ea743ab-33df-4834-b1e7-2ef7f1e1a147 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1029.669109] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1029.669256] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1029.783676] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d222033a-756b-4125-af2a-3043dc380796 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.791574] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a24e3aa8-8041-4191-80ea-55fb1fca5ab1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.820561] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81e54918-1fd7-4e7b-88eb-e5e380b01a3a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.827515] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d1cdf1-3529-4f33-b378-a0fd1ac916f1 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.840274] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1029.847841] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1029.860451] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1029.860630] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.256s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.922721] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd 
tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256793, 'name': RelocateVM_Task, 'duration_secs': 3.078656} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1029.923014] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Volume attach. Driver type: vmdk {{(pid=59382) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1029.923240] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-459795', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'name': 'volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '55c244ec-daa2-4eef-8de3-324d0815026b', 'attached_at': '', 'detached_at': '', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'serial': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128'} {{(pid=59382) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1029.923994] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e981b49-10de-4ec0-9e6b-e90c94c8f6cb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.941597] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c28d85f-32ea-480a-a0e5-c8497a6753dc {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.963234] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Reconfiguring VM instance instance-0000001f to attach disk [datastore1] volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128/volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128.vmdk or device None with type thin {{(pid=59382) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1029.963521] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-df772924-d725-47ef-80db-4e6c65fe6d9b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.982902] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1029.982902] env[59382]: value = "task-2256796" [ 1029.982902] env[59382]: _type = "Task" [ 1029.982902] env[59382]: } to complete. 
{{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.990327] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256796, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.492762] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256796, 'name': ReconfigVM_Task, 'duration_secs': 0.272225} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1030.494052] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Reconfigured VM instance instance-0000001f to attach disk [datastore1] volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128/volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128.vmdk or device None with type thin {{(pid=59382) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1030.498010] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-2ac516ad-640f-4e76-9b0c-495ef005f928 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1030.513084] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1030.513084] env[59382]: value = "task-2256797" [ 1030.513084] env[59382]: _type = "Task" [ 1030.513084] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1030.520631] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256797, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1030.856605] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1031.022504] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256797, 'name': ReconfigVM_Task, 'duration_secs': 0.135508} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1031.022811] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-459795', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'name': 'volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '55c244ec-daa2-4eef-8de3-324d0815026b', 'attached_at': '', 'detached_at': '', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'serial': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128'} {{(pid=59382) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1031.023370] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-91cb8a22-3327-4edb-9d06-e4b314a42577 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1031.029310] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1031.029310] env[59382]: value = "task-2256798" [ 1031.029310] env[59382]: _type = "Task" [ 1031.029310] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1031.036479] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256798, 'name': Rename_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1031.540025] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256798, 'name': Rename_Task} progress is 99%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.040432] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256798, 'name': Rename_Task} progress is 99%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1032.526903] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1032.540172] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256798, 'name': Rename_Task, 'duration_secs': 1.124559} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1032.540447] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Powering on the VM {{(pid=59382) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1032.540693] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-ce5e6b4c-f6ec-4bfe-979f-db6e72793e64 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1032.547584] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1032.547584] env[59382]: value = "task-2256799" [ 1032.547584] env[59382]: _type = "Task" [ 1032.547584] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1032.555797] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256799, 'name': PowerOnVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1033.056778] env[59382]: DEBUG oslo_vmware.api [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256799, 'name': PowerOnVM_Task, 'duration_secs': 0.457052} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1033.057110] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Powered on the VM {{(pid=59382) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1033.057268] env[59382]: INFO nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Took 9.03 seconds to spawn the instance on the hypervisor. [ 1033.057836] env[59382]: DEBUG nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Checking state {{(pid=59382) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1033.058420] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-700c7ab1-4c35-4a74-8cb5-18633a11a73e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1033.109618] env[59382]: INFO nova.compute.manager [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Took 9.99 seconds to build instance. 
[ 1033.120692] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fe2b893c-7044-499a-be31-37afb499cfcd tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "55c244ec-daa2-4eef-8de3-324d0815026b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 170.931s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1034.527445] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1035.041971] env[59382]: DEBUG nova.compute.manager [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Received event network-changed-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1035.041971] env[59382]: DEBUG nova.compute.manager [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Refreshing instance network info cache due to event network-changed-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4. {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11053}} [ 1035.042055] env[59382]: DEBUG oslo_concurrency.lockutils [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] Acquiring lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1035.042217] env[59382]: DEBUG oslo_concurrency.lockutils [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] Acquired lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1035.042364] env[59382]: DEBUG nova.network.neutron [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Refreshing network info cache for port e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1035.313123] env[59382]: DEBUG nova.network.neutron [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updated VIF entry in instance network info cache for port e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4. 
{{(pid=59382) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1035.313123] env[59382]: DEBUG nova.network.neutron [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Updating instance_info_cache with network_info: [{"id": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "address": "fa:16:3e:99:4d:3f", "network": {"id": "b08f6f0d-7e63-4899-9ba2-81aad8fd6545", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1135306959-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.154", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fe7ee8ba08b5483dba2ba9f7f0851021", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ebd8af-aaf6-4d04-b869-3882e2571ed7", "external-id": "nsx-vlan-transportzone-541", "segmentation_id": 541, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape4f99fe2-3a", "ovs_interfaceid": "e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1035.325371] env[59382]: DEBUG oslo_concurrency.lockutils [req-19f6cca7-87fa-4b8a-9c3b-012b44f1ccbe req-ac912519-6375-4d08-962e-69a44bd536dd service nova] Releasing lock "refresh_cache-55c244ec-daa2-4eef-8de3-324d0815026b" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1049.619878] env[59382]: DEBUG nova.compute.manager [req-49236e4a-1f71-4347-86a0-be7ead3bf9ec req-f88d53b0-3106-4329-9229-fc5d7635508b service nova] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Received event network-vif-deleted-2265de5a-b9e6-429e-80ef-67f11ac0e930 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1051.284242] env[59382]: INFO nova.compute.manager [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Rebuilding instance [ 1051.323319] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lazy-loading 'trusted_certs' on Instance uuid 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1051.335879] env[59382]: DEBUG nova.compute.manager [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Checking state {{(pid=59382) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1051.337137] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-260bff0a-28df-4e32-95d5-9bc073d3a5a1 {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1051.377559] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lazy-loading 'pci_requests' on Instance uuid 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1051.387394] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lazy-loading 'pci_devices' on Instance uuid 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1051.395206] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lazy-loading 'resources' on Instance uuid 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1051.402369] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lazy-loading 'migration_context' on Instance uuid 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1051.412206] env[59382]: DEBUG nova.objects.instance [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Trying to apply a migration context that does not seem to be set for this instance {{(pid=59382) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 1051.412627] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Powering off the VM {{(pid=59382) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1051.412897] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-80018d5a-1a53-40dc-909c-645de5eb668a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1051.420992] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1051.420992] env[59382]: value = "task-2256800" [ 1051.420992] env[59382]: _type = "Task" [ 1051.420992] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1051.429671] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256800, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1051.931641] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256800, 'name': PowerOffVM_Task, 'duration_secs': 0.186724} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1051.932212] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Powered off the VM {{(pid=59382) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1051.932902] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Powering off the VM {{(pid=59382) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1051.933167] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-3bd5a3ae-cd36-41f1-a3c2-847db18fea8c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1051.939424] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1051.939424] env[59382]: value = "task-2256801" [ 1051.939424] env[59382]: _type = "Task" [ 1051.939424] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1051.947672] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256801, 'name': PowerOffVM_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1052.450313] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] VM already powered off {{(pid=59382) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 1052.450574] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Volume detach. 
Driver type: vmdk {{(pid=59382) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1052.450726] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-459795', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'name': 'volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '55c244ec-daa2-4eef-8de3-324d0815026b', 'attached_at': '', 'detached_at': '', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'serial': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128'} {{(pid=59382) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1052.451494] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9aea6c3-8d30-428c-88d0-0adaf83a1330 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1052.469161] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-613f8b72-24d6-4939-8167-b825a4e96171 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1052.475345] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-798147a4-96f9-494a-a1cf-2f35d5c46b9e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1052.492488] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9077c8d-5215-4257-ab56-993aa635c6a6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1052.510751] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] The volume has not been displaced from its original location: [datastore1] volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128/volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128.vmdk. No consolidation needed. 
{{(pid=59382) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1052.516083] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Reconfiguring VM instance instance-0000001f to detach disk 2000 {{(pid=59382) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1052.516366] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-92ef92bc-38c4-4672-a71c-8fe40f09c0a4 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1052.533533] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1052.533533] env[59382]: value = "task-2256802" [ 1052.533533] env[59382]: _type = "Task" [ 1052.533533] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1052.540850] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256802, 'name': ReconfigVM_Task} progress is 5%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1053.043913] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256802, 'name': ReconfigVM_Task, 'duration_secs': 0.181245} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1053.044222] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Reconfigured VM instance instance-0000001f to detach disk 2000 {{(pid=59382) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1053.048882] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-7d29fcdc-dca4-45fe-89cc-4106e78980f6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.063415] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1053.063415] env[59382]: value = "task-2256803" [ 1053.063415] env[59382]: _type = "Task" [ 1053.063415] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1053.071037] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256803, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1053.365023] env[59382]: DEBUG nova.compute.manager [req-9c75f369-1da6-41d9-9006-840d96a4d687 req-7b430230-3c74-45e8-a64e-e4aaede786d1 service nova] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Received event network-vif-deleted-72e7e0b8-4690-4d93-9e83-3da57109f5d6 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1053.573558] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256803, 'name': ReconfigVM_Task, 'duration_secs': 0.112855} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1053.573918] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-459795', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'name': 'volume-21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '55c244ec-daa2-4eef-8de3-324d0815026b', 'attached_at': '', 'detached_at': '', 'volume_id': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128', 'serial': '21aaabc8-34b1-426a-b9f5-7f97f0fe6128'} {{(pid=59382) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1053.574198] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1053.574930] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aea8bb2a-2c0b-4b3d-ba0a-2c01753acc39 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.581273] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1053.581484] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-eb441a8c-e7c9-4311-9168-2acb69ca5d4f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.649213] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1053.649467] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 
55c244ec-daa2-4eef-8de3-324d0815026b] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1053.649648] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Deleting the datastore file [datastore1] 55c244ec-daa2-4eef-8de3-324d0815026b {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1053.649942] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d4a49af-d5c6-4c27-8041-fe97bd42eea3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.658758] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Waiting for the task: (returnval){ [ 1053.658758] env[59382]: value = "task-2256805" [ 1053.658758] env[59382]: _type = "Task" [ 1053.658758] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1053.666822] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256805, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1054.172977] env[59382]: DEBUG oslo_vmware.api [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Task: {'id': task-2256805, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077772} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1054.173579] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1054.173831] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1054.174068] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1054.238305] env[59382]: DEBUG nova.virt.vmwareapi.volumeops [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Volume detach. 
Driver type: vmdk {{(pid=59382) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1054.238652] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2cf177c9-7379-4419-bdf5-f9d291037942 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.246949] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-923c58a9-f177-450e-aa30-cb38f3b13af9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.275116] env[59382]: ERROR nova.compute.manager [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Failed to detach volume 21aaabc8-34b1-426a-b9f5-7f97f0fe6128 from /dev/sda: nova.exception.InstanceNotFound: Instance 55c244ec-daa2-4eef-8de3-324d0815026b could not be found. [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Traceback (most recent call last): [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self.driver.rebuild(**kwargs) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise NotImplementedError() [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] NotImplementedError [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] During handling of the above exception, another exception occurred: [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Traceback (most recent call last): [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self.driver.detach_volume(context, old_connection_info, [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] return self._volumeops.detach_volume(connection_info, instance) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._detach_volume_vmdk(connection_info, instance) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 
55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] stable_ref.fetch_moref(session) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] nova.exception.InstanceNotFound: Instance 55c244ec-daa2-4eef-8de3-324d0815026b could not be found. [ 1054.275116] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.398848] env[59382]: DEBUG nova.compute.utils [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Build of instance 55c244ec-daa2-4eef-8de3-324d0815026b aborted: Failed to rebuild volume backed instance. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1054.401400] env[59382]: ERROR nova.compute.manager [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 55c244ec-daa2-4eef-8de3-324d0815026b aborted: Failed to rebuild volume backed instance. 
[ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Traceback (most recent call last): [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self.driver.rebuild(**kwargs) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise NotImplementedError() [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] NotImplementedError [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] During handling of the above exception, another exception occurred: [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Traceback (most recent call last): [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._detach_root_volume(context, instance, root_bdm) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] with excutils.save_and_reraise_exception(): [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self.force_reraise() [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise self.value [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self.driver.detach_volume(context, old_connection_info, [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] return self._volumeops.detach_volume(connection_info, instance) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File 
"/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._detach_volume_vmdk(connection_info, instance) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] stable_ref.fetch_moref(session) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] nova.exception.InstanceNotFound: Instance 55c244ec-daa2-4eef-8de3-324d0815026b could not be found. [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] During handling of the above exception, another exception occurred: [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Traceback (most recent call last): [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 10782, in _error_out_instance_on_exception [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] yield [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 1054.401400] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._do_rebuild_instance_with_claim( [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._do_rebuild_instance( [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] self._rebuild_default_impl(**kwargs) [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] 
self._rebuild_volume_backed_instance( [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] raise exception.BuildAbortException( [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] nova.exception.BuildAbortException: Build of instance 55c244ec-daa2-4eef-8de3-324d0815026b aborted: Failed to rebuild volume backed instance. [ 1054.402407] env[59382]: ERROR nova.compute.manager [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] [ 1054.484754] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1054.485043] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1054.555174] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec1e3b90-578c-4a40-ba19-fb7b90699d4d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.563254] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aeedda8-a4e6-46af-be3a-7ab2168b52c2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.594507] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67864d79-da84-4ecf-b4bc-77c4f26471ba {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.601782] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9c56e54-7cb9-49dd-a973-d9c56a13499e {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1054.614639] env[59382]: DEBUG nova.compute.provider_tree [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1054.627913] env[59382]: DEBUG nova.scheduler.client.report [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1054.644300] env[59382]: DEBUG oslo_concurrency.lockutils [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1054.644499] env[59382]: INFO nova.compute.manager [None req-1aa98101-094f-46d5-808f-73e1aad4fbc4 tempest-ServerActionsV293TestJSON-1243774595 tempest-ServerActionsV293TestJSON-1243774595-project-member] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Successfully reverted task state from rebuilding on failure for instance. [ 1055.434287] env[59382]: DEBUG nova.compute.manager [req-dc6d3836-402a-4cee-b72d-bc1302fa447a req-2a9e5d98-f8fd-4085-93a1-f613239f1b65 service nova] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Received event network-vif-deleted-a4f52aeb-2cd7-4423-a960-0819eb5ab3a9 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1055.434483] env[59382]: DEBUG nova.compute.manager [req-dc6d3836-402a-4cee-b72d-bc1302fa447a req-2a9e5d98-f8fd-4085-93a1-f613239f1b65 service nova] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Received event network-vif-deleted-cf7c2d90-e7e6-4c8e-a1ea-abbebb909958 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1057.464993] env[59382]: DEBUG nova.compute.manager [req-8b9d831c-e93c-47f3-8959-bb33303b5b92 req-45e8f33a-1938-43d1-ab9f-29cf720f3428 service nova] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Received event network-vif-deleted-e4f99fe2-3ab8-4a44-8950-1b761bc8a7a4 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1065.453133] env[59382]: DEBUG nova.compute.manager [req-5f576be0-97f4-4f77-900f-d6cdeef53a7f req-24d0f624-0307-4124-b529-c4ff3b7ebd8f service nova] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Received event network-vif-deleted-77c63660-6775-4d77-9825-698c6ef5ea2e {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1067.479057] env[59382]: DEBUG nova.compute.manager [req-ec15be3b-c774-4aa4-8eb7-de6e05f44b9c req-9671497c-5b6e-4936-8c40-f95c3a130ed1 service nova] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Received event network-vif-deleted-4b811be5-378f-4876-aa4d-571a92ce5cf5 {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1070.301118] env[59382]: WARNING oslo_vmware.rw_handles [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1070.301118] 
env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.301118] env[59382]: ERROR oslo_vmware.rw_handles [ 1070.301118] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1070.301118] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1070.301118] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Copying Virtual Disk [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/d42d1721-2804-4f32-bbdf-071094a8a0d6/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1070.301118] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9d41ef00-b195-4f16-acbc-c2c9618c2d61 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.309788] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 1070.309788] env[59382]: value = "task-2256807" [ 1070.309788] env[59382]: _type = "Task" [ 1070.309788] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.320368] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256807, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1070.820793] env[59382]: DEBUG oslo_vmware.exceptions [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1070.821424] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1070.822071] env[59382]: ERROR nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1070.822071] env[59382]: Faults: ['InvalidArgument'] [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Traceback (most recent call last): [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] yield resources [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] self.driver.spawn(context, instance, image_meta, [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] self._fetch_image_if_missing(context, vi) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] image_cache(vi, tmp_image_ds_loc) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] vm_util.copy_virtual_disk( [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] session._wait_for_task(vmdk_copy_task) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] return self.wait_for_task(task_ref) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] return evt.wait() [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] result = hub.switch() [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] return self.greenlet.switch() [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] self.f(*self.args, **self.kw) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] raise exceptions.translate_fault(task_info.error) [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Faults: ['InvalidArgument'] [ 1070.822071] env[59382]: ERROR nova.compute.manager [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] [ 1070.822980] env[59382]: INFO nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Terminating instance [ 1070.825197] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1070.825455] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1070.825724] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4d5f03c7-c750-4bca-acb3-92c28b4a6b9c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.828215] env[59382]: 
DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1070.828446] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1070.829222] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-589cce63-635e-4e87-b4b6-0c02019c2a4b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.836215] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1070.838032] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-813540f8-a8e5-4454-9a47-05abc5ea2adf {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.839136] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1070.839349] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1070.840022] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-83bbb73c-5714-4b7d-abac-3243e36faec6 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.847064] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for the task: (returnval){ [ 1070.847064] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e8e912-b162-357f-a468-b072fe832f23" [ 1070.847064] env[59382]: _type = "Task" [ 1070.847064] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.855311] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52e8e912-b162-357f-a468-b072fe832f23, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1070.920778] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1070.921479] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1070.925019] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Deleting the datastore file [datastore1] 03203308-bbd5-4adf-80a3-e851b9341f62 {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1070.925019] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3b95c8cb-8119-4e0a-a0c5-438c1f20779d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.929556] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Waiting for the task: (returnval){ [ 1070.929556] env[59382]: value = "task-2256809" [ 1070.929556] env[59382]: _type = "Task" [ 1070.929556] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.937726] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256809, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1071.358065] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1071.358373] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Creating directory with path [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1071.358619] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e35f5220-4968-4df8-87e7-ba0f39c1c6a8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.370223] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Created directory with path [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1071.370574] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Fetch image to [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1071.370574] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1071.371356] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-974b5608-9161-4e77-a240-f113a5f731b2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.378351] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3161bfb4-7661-483d-b0e8-e1e0bd207cb7 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.387229] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91f80680-97a8-4464-890c-f099c808440d {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.428345] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71fd3d9c-78e6-45d0-b7b2-2fd1447344ae {{(pid=59382) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.439840] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-287bc581-5ffd-4c6b-a6ed-1df456b65310 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.441603] env[59382]: DEBUG oslo_vmware.api [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Task: {'id': task-2256809, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065684} completed successfully. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1071.441838] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1071.442036] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1071.442231] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1071.442354] env[59382]: INFO nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1071.444497] env[59382]: DEBUG nova.compute.claims [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1071.444665] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1071.444874] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1071.463406] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1071.473915] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1071.474032] env[59382]: DEBUG nova.compute.utils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance 03203308-bbd5-4adf-80a3-e851b9341f62 could not be found. {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1071.475852] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance disappeared during build. {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1071.476033] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1071.476199] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1071.476359] env[59382]: DEBUG nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1071.476555] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1071.515385] env[59382]: DEBUG nova.network.neutron [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1071.520064] env[59382]: DEBUG oslo_vmware.rw_handles [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1071.574510] env[59382]: INFO nova.compute.manager [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Took 0.10 seconds to deallocate network for instance. [ 1071.580150] env[59382]: DEBUG oslo_vmware.rw_handles [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1071.580150] env[59382]: DEBUG oslo_vmware.rw_handles [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1071.632268] env[59382]: DEBUG oslo_concurrency.lockutils [None req-60b00d27-d4b3-4dc4-90be-90339ae930f3 tempest-DeleteServersAdminTestJSON-234622206 tempest-DeleteServersAdminTestJSON-234622206-project-member] Lock "03203308-bbd5-4adf-80a3-e851b9341f62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 337.047s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1074.563708] env[59382]: DEBUG nova.compute.manager [req-54ac6df4-b04a-481a-8c27-74100c9f3967 req-2842b9f7-8782-4652-9ec2-1cb71becba02 service nova] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Received event network-vif-deleted-14781ef3-1c0a-4b58-9415-43e1d4a6766f {{(pid=59382) external_instance_event /opt/stack/nova/nova/compute/manager.py:11048}} [ 1085.527115] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1085.527115] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Cleaning up deleted instances {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11145}} [ 1085.562326] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] There are 15 instances to clean {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11154}} [ 1085.562603] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 8ea743ab-33df-4834-b1e7-2ef7f1e1a147] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.598324] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 5fd51316-ab8f-4501-8389-de12a294f8da] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.633023] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: e57c71dc-fdb2-4861-b716-c6caebd6c29e] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.651262] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 55c244ec-daa2-4eef-8de3-324d0815026b] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.670210] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 05e46e58-1de8-48a0-a139-c202d77e85ad] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.688660] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 81f08c14-ee4b-4954-bf53-dc02bb600279] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.707721] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 2dfb7e00-fea7-4186-914a-98e1e5fbe49a] Instance has had 
0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.726349] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.744882] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 03203308-bbd5-4adf-80a3-e851b9341f62] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.762190] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: d3d59ff4-eaa9-46b3-8279-50e5cfe740a0] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.779664] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: cf672665-36c7-4251-a32a-537b9d4c38ed] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.797111] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: acae2ecc-9a00-4356-96d7-a7521ea46f32] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.816639] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: 3c235411-c50f-40b5-a681-ca42b7838506] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.835660] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: feea4bca-d134-475f-81b9-c8415bacf1f1] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1085.854486] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] [instance: c2f5545d-884a-4166-a93b-810ef311c2e6] Instance has had 0 of 5 cleanup attempts {{(pid=59382) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11158}} [ 1087.894268] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1088.526791] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1088.526992] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Starting heal instance info cache {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9858}} [ 1088.527144] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Rebuilding the list of instances to heal {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9862}} [ 1088.537433] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] 
[instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Skipping network cache update for instance because it is Building. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9871}} [ 1088.537603] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Didn't find any instances for network info cache update. {{(pid=59382) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9944}} [ 1089.526521] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1089.526936] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1091.527654] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1091.528053] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59382) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10477}} [ 1091.528053] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1091.537592] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1091.537801] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1091.537963] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1091.538131] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59382) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1091.539212] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9db8b8ac-bd4c-49f2-8873-e7c09bb452d2 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.548034] env[59382]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4164357-5500-481b-8553-50ab5ff40284 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.561427] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e835f5c8-6099-4b95-a9ba-2f2ce1a24fa8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.567354] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18a3738c-caab-44bf-b853-2b2e7149e444 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.595452] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181082MB free_disk=170GB free_vcpus=48 pci_devices=None {{(pid=59382) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1091.595618] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1091.595773] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1091.693029] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59382) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1091.693029] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1091.693029] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59382) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1091.719144] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68f5d10a-a843-49b5-b9b9-2ecc7d07031f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.726608] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-869c9149-2fb4-470f-971d-aa50c44f0819 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.756987] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e9e5e40-8c39-4ef5-ad92-0a594bf1e26f {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.763692] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cc7644c-d444-4f7a-b9a6-eabb0cc53d09 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1091.777337] env[59382]: DEBUG nova.compute.provider_tree [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1091.788142] env[59382]: DEBUG nova.scheduler.client.report [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1091.799132] env[59382]: DEBUG nova.compute.resource_tracker [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59382) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1091.799312] env[59382]: DEBUG oslo_concurrency.lockutils [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.204s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1092.793427] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1092.793952] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1094.522420] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1094.533422] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1094.533584] env[59382]: DEBUG nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Cleaning up deleted instances with incomplete migration {{(pid=59382) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11183}} [ 1094.542800] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1096.542757] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.231320] env[59382]: WARNING oslo_vmware.rw_handles [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles response.begin() [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1117.231320] env[59382]: ERROR oslo_vmware.rw_handles [ 1117.233583] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 
tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Downloaded image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1117.233724] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Caching image {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1117.233843] env[59382]: DEBUG nova.virt.vmwareapi.vm_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Copying Virtual Disk [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk to [datastore1] vmware_temp/60ed5fa7-15c5-4d10-b321-74701c7ac75b/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk {{(pid=59382) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1117.234148] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-351f812a-c2eb-4c14-a78d-0036dbe3573a {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.241827] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for the task: (returnval){ [ 1117.241827] env[59382]: value = "task-2256810" [ 1117.241827] env[59382]: _type = "Task" [ 1117.241827] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1117.249748] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Task: {'id': task-2256810, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1117.752789] env[59382]: DEBUG oslo_vmware.exceptions [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Fault InvalidArgument not matched. 
{{(pid=59382) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1117.753046] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Releasing lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1117.753596] env[59382]: ERROR nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1117.753596] env[59382]: Faults: ['InvalidArgument'] [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Traceback (most recent call last): [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] yield resources [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self.driver.spawn(context, instance, image_meta, [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self._fetch_image_if_missing(context, vi) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] image_cache(vi, tmp_image_ds_loc) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] vm_util.copy_virtual_disk( [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] session._wait_for_task(vmdk_copy_task) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return self.wait_for_task(task_ref) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return evt.wait() [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] result = hub.switch() [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return self.greenlet.switch() [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self.f(*self.args, **self.kw) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] raise exceptions.translate_fault(task_info.error) [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Faults: ['InvalidArgument'] [ 1117.753596] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] [ 1117.754774] env[59382]: INFO nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Terminating instance [ 1117.755433] env[59382]: DEBUG oslo_concurrency.lockutils [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Acquired lock "[datastore1] devstack-image-cache_base/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/4092f5d9-e52b-450e-8bc7-85f1a22d3b71.vmdk" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1117.755721] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1117.755966] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5eb12d93-b2c9-40cb-bd60-21fa049767fb {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.758138] env[59382]: DEBUG oslo_concurrency.lockutils [None 
req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1117.758299] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1117.758464] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1117.765016] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1117.765205] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59382) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1117.766325] env[59382]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a4b58467-d87f-40e7-bb29-0eabebde1cd9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.773648] env[59382]: DEBUG oslo_vmware.api [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Waiting for the task: (returnval){ [ 1117.773648] env[59382]: value = "session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b6bd85-f474-e317-f55b-7bb17762c67b" [ 1117.773648] env[59382]: _type = "Task" [ 1117.773648] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1117.781012] env[59382]: DEBUG oslo_vmware.api [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Task: {'id': session[52fd4a8d-0eb2-b0e1-8a21-f46eeb5ac083]52b6bd85-f474-e317-f55b-7bb17762c67b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1117.785936] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance cache missing network info. 
{{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1117.866343] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1117.874787] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Releasing lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1117.875191] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1117.875382] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1117.876418] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-278401e0-5683-4010-8f97-2d6c1c7b12b9 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.884921] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Unregistering the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1117.885143] env[59382]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e1960c1f-8d71-44b9-9032-c24396986740 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.921432] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Unregistered the VM {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1117.921660] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Deleting contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1117.921850] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Deleting the datastore file [datastore1] 203e8cdb-621d-461a-97ba-3e3782f04d1d {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1117.922095] env[59382]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d8ae6362-e9ec-4379-990f-e02204546241 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1117.927877] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for the task: (returnval){ [ 1117.927877] env[59382]: value = "task-2256812" [ 1117.927877] env[59382]: _type = "Task" [ 1117.927877] env[59382]: } to complete. {{(pid=59382) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1117.935403] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Task: {'id': task-2256812, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1118.283256] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Preparing fetch location {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1118.283661] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Creating directory with path [datastore1] vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1118.283720] env[59382]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76ef0298-28b4-44d7-8e98-e4ff7badc06c {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.294747] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Created directory with path [datastore1] vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71 {{(pid=59382) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1118.294940] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Fetch image to [datastore1] vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk {{(pid=59382) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1118.295104] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to [datastore1] vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk on the data store datastore1 {{(pid=59382) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1118.295787] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-93fe8205-61e0-41ac-9e21-f58528e456ce {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.302081] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b770fd18-230d-4920-89c0-d1021a72c493 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.310656] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f11321-01a5-41fd-83c0-f35022197226 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.339797] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dea419e-fbf7-418c-be03-2eb5344628af {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.344789] env[59382]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-007c6bb0-7c84-4501-bd80-90454e8017c3 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.366738] env[59382]: DEBUG nova.virt.vmwareapi.images [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] [instance: 390366c5-ced3-4ac9-9687-c5d2895fbc1a] Downloading image file data 4092f5d9-e52b-450e-8bc7-85f1a22d3b71 to the data store datastore1 {{(pid=59382) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1118.409606] env[59382]: DEBUG oslo_vmware.rw_handles [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1118.466732] env[59382]: DEBUG oslo_vmware.rw_handles [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Completed reading data from the image iterator. {{(pid=59382) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1118.466915] env[59382]: DEBUG oslo_vmware.rw_handles [None req-fd2781aa-f2cb-4754-9418-b2ea7b6c2140 tempest-ServersTestJSON-180286613 tempest-ServersTestJSON-180286613-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3740f7d8-33ea-447e-89fa-70a1c305fe40/4092f5d9-e52b-450e-8bc7-85f1a22d3b71/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59382) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1118.470379] env[59382]: DEBUG oslo_vmware.api [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Task: {'id': task-2256812, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.032768} completed successfully. 
{{(pid=59382) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1118.470627] env[59382]: DEBUG nova.virt.vmwareapi.ds_util [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Deleted the datastore file {{(pid=59382) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1118.470792] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Deleted contents of the VM from datastore datastore1 {{(pid=59382) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1118.470987] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1118.471184] env[59382]: INFO nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1118.471405] env[59382]: DEBUG oslo.service.loopingcall [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1118.471612] env[59382]: DEBUG nova.compute.manager [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1118.473751] env[59382]: DEBUG nova.compute.claims [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Aborting claim: {{(pid=59382) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1118.473925] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1118.474171] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1118.531642] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8e022c9-0c7b-49b0-ba10-cf7f9661ce13 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1118.538594] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a173e0-24e3-4379-958f-ad86f1f99897 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.196813] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d1e349b-1f13-46a5-8cc2-5e0832b7ab84 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.204369] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-071eeef7-65aa-44a5-804d-a4ca06939f57 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.217118] env[59382]: DEBUG nova.compute.provider_tree [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Inventory has not changed in ProviderTree for provider: 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 {{(pid=59382) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1119.226521] env[59382]: DEBUG nova.scheduler.client.report [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Inventory has not changed for provider 0ed62ac0-b25e-450c-a6ea-1ad3f7977975 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 170, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59382) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1119.239697] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 
tempest-ServerShowV254Test-1579840177-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.765s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1119.240268] env[59382]: ERROR nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1119.240268] env[59382]: Faults: ['InvalidArgument'] [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Traceback (most recent call last): [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self.driver.spawn(context, instance, image_meta, [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self._fetch_image_if_missing(context, vi) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] image_cache(vi, tmp_image_ds_loc) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] vm_util.copy_virtual_disk( [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] session._wait_for_task(vmdk_copy_task) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return self.wait_for_task(task_ref) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return evt.wait() [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 
203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] result = hub.switch() [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] return self.greenlet.switch() [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] self.f(*self.args, **self.kw) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] raise exceptions.translate_fault(task_info.error) [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Faults: ['InvalidArgument'] [ 1119.240268] env[59382]: ERROR nova.compute.manager [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] [ 1119.241083] env[59382]: DEBUG nova.compute.utils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] VimFaultException {{(pid=59382) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1119.242211] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Build of instance 203e8cdb-621d-461a-97ba-3e3782f04d1d was re-scheduled: A specified parameter was not correct: fileType [ 1119.242211] env[59382]: Faults: ['InvalidArgument'] {{(pid=59382) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1119.242575] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Unplugging VIFs for instance {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1119.242809] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1119.242960] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1119.243127] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1119.264971] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.321134] env[59382]: DEBUG nova.network.neutron [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.329010] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Releasing lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1119.329224] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59382) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1119.329398] env[59382]: DEBUG nova.compute.manager [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1119.408717] env[59382]: INFO nova.scheduler.client.report [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Deleted allocations for instance 203e8cdb-621d-461a-97ba-3e3782f04d1d [ 1119.423716] env[59382]: DEBUG oslo_concurrency.lockutils [None req-7fa04e78-f121-4d82-bff5-c62e43e3b617 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 336.061s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1119.423960] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 139.952s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1119.424189] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "203e8cdb-621d-461a-97ba-3e3782f04d1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1119.424388] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1119.424555] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1119.426558] env[59382]: INFO nova.compute.manager [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Terminating instance [ 1119.428085] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquiring lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1119.428249] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Acquired lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1119.428416] env[59382]: DEBUG nova.network.neutron [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Building network info cache for instance {{(pid=59382) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1119.453282] env[59382]: DEBUG nova.network.neutron [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.511145] env[59382]: DEBUG nova.network.neutron [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.520873] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Releasing lock "refresh_cache-203e8cdb-621d-461a-97ba-3e3782f04d1d" {{(pid=59382) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1119.521294] env[59382]: DEBUG nova.compute.manager [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Start destroying the instance on the hypervisor. {{(pid=59382) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1119.521479] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Destroying instance {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1119.521947] env[59382]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6e04f304-9866-44e9-af13-2e36a639e5ca {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.530872] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a427910f-f43b-4217-9d1e-5d846ad2099b {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1119.557555] env[59382]: WARNING nova.virt.vmwareapi.vmops [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 203e8cdb-621d-461a-97ba-3e3782f04d1d could not be found. 
[ 1119.557681] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance destroyed {{(pid=59382) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1119.557941] env[59382]: INFO nova.compute.manager [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1119.558077] env[59382]: DEBUG oslo.service.loopingcall [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59382) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1119.558280] env[59382]: DEBUG nova.compute.manager [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Deallocating network for instance {{(pid=59382) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1119.558390] env[59382]: DEBUG nova.network.neutron [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] deallocate_for_instance() {{(pid=59382) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1119.574961] env[59382]: DEBUG nova.network.neutron [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Instance cache missing network info. {{(pid=59382) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1119.581552] env[59382]: DEBUG nova.network.neutron [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Updating instance_info_cache with network_info: [] {{(pid=59382) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1119.589340] env[59382]: INFO nova.compute.manager [-] [instance: 203e8cdb-621d-461a-97ba-3e3782f04d1d] Took 0.03 seconds to deallocate network for instance. 
[ 1119.665899] env[59382]: DEBUG oslo_concurrency.lockutils [None req-b20175bb-4484-490e-a0e7-69364690b694 tempest-ServerShowV254Test-1579840177 tempest-ServerShowV254Test-1579840177-project-member] Lock "203e8cdb-621d-461a-97ba-3e3782f04d1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.242s {{(pid=59382) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1137.764879] env[59382]: DEBUG oslo_service.periodic_task [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59382) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1137.773393] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Getting list of instances from cluster (obj){ [ 1137.773393] env[59382]: value = "domain-c8" [ 1137.773393] env[59382]: _type = "ClusterComputeResource" [ 1137.773393] env[59382]: } {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1137.774441] env[59382]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd376a64-4b6d-4bf7-ba9c-b731fac2e2e8 {{(pid=59382) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1137.789055] env[59382]: DEBUG nova.virt.vmwareapi.vmops [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] Got total of 7 instances {{(pid=59382) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1137.789220] env[59382]: WARNING nova.compute.manager [None req-aa7ecc92-9cb7-46eb-87c8-739e23dc21d1 None None] While synchronizing instance power states, found 0 instances in the database and 7 instances on the hypervisor.